Nine key questions about our methodology

We run the largest, most global independent developer surveys. How do we manage to reach 30,000 developers annually?

Methodology

At SlashData, we are very proud of our methodology, and we aim for clarity, transparency and simplicity when describing it.

There are nine essential, non-negotiable qualities that run through our methodology model and guarantee its integrity. The following Q&A guides you through each of them.

Question 1:

Where do the SlashData research data and insights come from?

We run the largest, most global independent developer surveys - twice per year

Our Developer Economics research is based on biannual online developer surveys. Twice a year we reach out to 15,000+ mobile, IoT, cloud and desktop developers from more than 140 countries. As a result, we get to hear the opinions and investment decisions of more than 30,000 developers annually. We take pride in running the largest and most global independent survey.

Magnitude

Question 2:

What do you mean "this is an independent survey"?

No vendor, community or other partner owns our surveys or data

Our survey respondents do not come from any single developer community - vendor-owned or otherwise. Instead, they come from more than 60 different outreach channels. Therefore, our results are not biased towards the mindset of any single community out there. In fact, we go to great lengths to ensure, to the degree possible, that all the different developer mindsets and geographies are adequately represented in our samples.

Impartiality

Question 3:

How do you get to a representative sample of 15,000+ developers?

60+ regional & media partners, translations in 8+ languages

We translate our surveys into 8+ languages, including French, Spanish, Portuguese, Russian, Korean, Chinese (Simplified), Vietnamese and Turkish. In that way we ensure that we also reach non-English-speaking communities around the world - in other words, that our results are not biased in favour of English speakers. We also work with 60+ regional and media partners to promote our survey. These range from student communities to professional developer forums, and from small local communities (such as Meetups in Kenya) to large vendor communities (such as Amazon, Microsoft and Intel) - and we add new ones with every survey. In this way we get to speak to a widely diverse set of developer profiles all around the world.

Inclusivity

Question 4:

So you don’t run your own panel?

We do and we use panel responses to track developer evolution

Our developer panel counts more than 22,000 members from 110+ countries, with new developers joining with every survey that we run. Approximately 10% of each sample comes from returning panel members. Our returning panelists provide us with a way to track the evolution of developer behaviour (such as transitions from one platform or language to another, or changes in motivations and career paths). We also use panel surveys to conduct qualitative research: where greater clarity is needed, the data from the large-scale surveys may be complemented with smaller panel surveys focused on specific demographics or technologies. These smaller surveys can delve deep into an area with open-ended questions to measure the qualitative aspects of software development.

Consistency

Question 5:

Why go to such lengths to reach developers and not just draw on your panel?

Surveying the same people repeatedly is extremely limiting

Surveying the same pool of people repeatedly leads to results strongly biased towards the beliefs of certain panelists. If we relied solely on a fixed panel, we would miss important developer behaviours that may be emerging in communities outside it. By reaching out to the developer population afresh each time and populating our sample with new developers, we make sure we don't miss new technologies or new “types” of developers that have emerged. We also verify that our results hold no matter what the sample is - in other words, we can rest assured that the trends we show are real and not merely the result of following the same people through our surveys.

Substance

Question 6:

How do you incentivise developers to take your surveys?

Prizes, free content, fun, learning and a chance to have their say

We offer developer-specific prizes as part of a draw. We carefully choose prizes to be of interest only to developers (examples include hardware prototyping boards and IDE licenses), so that we don't attract non-developers to our surveys. We also release free content that is of interest to developers, such as our State of Developer Nation reports - many respondents tell us that they participate to make these reports possible. We also provide each respondent with customised content in the form of a dashboard where they can see how they compare to other developers. Many developers also tell us they learn a lot through our surveys - and most have a good laugh with the memes we include every few pages. Finally, we also interview selected developers from among those who took part in our surveys and quote them in our reports. A chance to have their say matters to many developers.

Engagement

Question 7:

How do you know that respondents are not randomly clicking through the survey to get to the prizes?

We cleanse our data and check all responses for integrity

We thoroughly check our raw data set for integrity and unceremoniously throw out any suspicious-looking responses. We run tests based on - among other things - completion times, duplicate responses and consistency of answers to related questions.
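To illustrate the kinds of integrity checks described above, here is a minimal sketch in Python. SlashData's actual pipeline is not public, so every field name, threshold and rule below is an assumption made for illustration only:

```python
# Illustrative sketch of response-integrity checks. The field names
# ("completion_seconds", "respondent_id", etc.), the 120-second floor and
# the consistency rule are all hypothetical, not SlashData's real criteria.

MIN_COMPLETION_SECONDS = 120  # assumed floor; faster suggests random clicking


def is_suspicious(response, seen_ids):
    """Flag a response that fails any of three simple integrity checks."""
    # 1. Completion time: implausibly fast completions are thrown out.
    if response["completion_seconds"] < MIN_COMPLETION_SECONDS:
        return True
    # 2. Duplicate responses from the same respondent.
    if response["respondent_id"] in seen_ids:
        return True
    # 3. Consistency of related answers: e.g. a self-reported senior role
    #    combined with zero years of experience is contradictory.
    if response["years_experience"] == 0 and response["role"] == "senior":
        return True
    return False


def cleanse(responses):
    """Keep only responses that pass every integrity check."""
    kept, seen_ids = [], set()
    for r in responses:
        if not is_suspicious(r, seen_ids):
            kept.append(r)
            seen_ids.add(r["respondent_id"])
    return kept


raw = [
    {"respondent_id": "a1", "completion_seconds": 300,
     "years_experience": 5, "role": "senior"},   # kept
    {"respondent_id": "a1", "completion_seconds": 400,
     "years_experience": 5, "role": "senior"},   # duplicate: dropped
    {"respondent_id": "b2", "completion_seconds": 30,
     "years_experience": 2, "role": "junior"},   # too fast: dropped
]
clean = cleanse(raw)
```

In this toy example only the first response survives; a real pipeline would combine many more signals, but the filtering structure is the same.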

Diligence

Question 8:

No matter what you do, there will still be sampling bias. How do you deal with that?

We weight results by region, platform and segment to minimise bias

In all analysis it is important to be aware of unavoidable sampling bias, and to mitigate it. To that end we employ rigorous statistical tests and methods to identify and minimise bias caused by regional, platform or developer-segment over- or under-representation in our sample. If, for example, we find that we have too many hobbyists in our sample - compared to our estimate of the real percentage of hobbyists among developers - we 'weight down' hobbyist responses, so that they don't affect our results out of proportion.
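The 'weighting down' idea amounts to simple arithmetic: each segment's weight is its estimated population share divided by its observed share in the sample. The sketch below is illustrative only - the segment labels and target shares are assumptions, not SlashData's actual figures or method:

```python
# Illustrative post-stratification weighting. The segments and target
# shares are hypothetical; a real analysis would use estimated population
# shares per region, platform and developer segment.
from collections import Counter


def segment_weights(sample_segments, target_shares):
    """Return a per-segment weight so that weighted shares match targets.

    sample_segments: one segment label per respondent.
    target_shares: estimated 'true' population share per segment (sums to 1).
    """
    n = len(sample_segments)
    counts = Counter(sample_segments)
    # weight = target share / observed share. Over-represented segments get
    # a weight below 1, i.e. they are 'weighted down'.
    return {seg: target_shares[seg] / (counts[seg] / n) for seg in counts}


# Suppose half our 100 respondents are hobbyists, but we estimate that
# hobbyists are only 30% of the developer population:
sample = ["hobbyist"] * 50 + ["professional"] * 50
weights = segment_weights(sample, {"hobbyist": 0.3, "professional": 0.7})
```

Here each hobbyist response would count as 0.6 of a response and each professional response as 1.4, so the weighted hobbyist share lands on the 30% target.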

Confidence

Question 9:

Do you survey only mobile developers?

No - we cover seven development areas

We believe that software is eating the world, and we are committed to exploring this fascinating phenomenon in all its expressions across several different types of development. We cover multiple development areas: mobile, IoT, backend, desktop and, most recently with our 11th wave, augmented reality and data science. Last but not least, we also reach out to developers building messaging bots.