• Artificial Intelligence (AI) may overtake large parts of current automation, making it redundant.

• Automation may tend to report the simple truths, not the key ones.

• If research buyers are replaced by AI systems, then the human element in the supply of MR may be less relevant.

• The ‘cost reduction’ mentality may spill over to tasks that can’t (yet) be automated – taking some good solutions off the table.

• Potential for a dominant supplier to appear and distort the market (like Google in online advertising).

• The life cycle of some new tools may be too short to make money.

• Risk of annoying customers (i.e. the customers of the research clients) and provoking a customer backlash.

Automation Today: Three Big Trends

The three big trends in automation are:

Robots – devices that work without a person being there. For example, driverless cars and trains, automated hotel check-ins, drones, and robots in factories.

Platforms – apps and software that are disintermediating tasks and markets. For example, Expedia, ZappiStore, and Lyft.

AI (Artificial Intelligence) – automation that tackles tasks such as analysis and creativity. AI will mine data to detect fraud, design research, and suggest business strategies.

Forecast for Automation in Market Research

AI has been relatively slow to take off in Market Research due to the lack of relevant tools. However, I think there are signs that this is about to change.

We will see several AI incursions into market research over the next five years:

Text analytics. This has been improving for years and we are now at the tipping point. With text analytics in place, the balance between open and closed questions (and between qual and quant) will shift. We know that open-ended questions are usually richer, but closed questions (e.g. a 5-point scale) are easier to process. When we can process the open-ends, we will encourage the use of them, rather than avoid them. We are already seeing AI-driven platforms like Remesh emerge – these use AI to analyse thousands of verbatim responses in real time, delivering new research approaches.
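To make the idea of automated coding of open-ends concrete, here is a toy sketch. Real text-analytics platforms (such as Remesh) use machine learning rather than keyword lists, and the theme names and keywords below are invented purely for illustration:

```python
# Toy illustration of automated coding of open-ended survey responses.
# Real text-analytics platforms use machine learning, not keyword lists;
# the themes and keywords below are invented for this example.
import re
from collections import Counter

THEMES = {
    "price":   {"expensive", "cheap", "cost", "price"},
    "service": {"staff", "helpful", "rude", "service"},
    "quality": {"broken", "reliable", "quality", "durable"},
}

def code_verbatim(text):
    """Return the set of themes whose keywords appear in the response."""
    words = set(re.findall(r"[a-z]+", text.lower()))
    return {theme for theme, kws in THEMES.items() if words & kws}

responses = [
    "The staff were helpful but the price was too high",
    "Great quality, very reliable",
    "Too expensive and the packaging was broken",
]

# Tally how often each theme occurs across all the open-ends.
counts = Counter(theme for r in responses for theme in code_verbatim(r))
print(counts)  # price: 2, quality: 2, service: 1
```

Even this crude version shows why automating the coding step changes the economics: the cost of processing 10,000 verbatims is barely higher than the cost of processing 10.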

Image analytics. When researchers ask participants to take pictures, we normally ask for only a few, and when we ask them to record a video, we limit it to 30–40 seconds. This is because it takes so long for humans to process images and video. But now the tools to process images and video with bots are arriving, which will change the tasks that researchers ask participants to do.

Project design. This is likely to be achieved by creating a Q&A system with users that suggests research routes. Once a route is agreed, the bot will create the survey, commission the research, and interpret the results. This is close to what some existing products do (including many from ZappiStore), but it goes to the next step. Current systems have a limited range of options and the approach is relatively formulaic. The next step is to widen the choices and to make the interpretation more intelligent, utilising open-ended responses.
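The Q&A-driven design step described above can be pictured as a decision tree that maps a client's answers to a recommended route. The sketch below is a deliberately simplistic rule-based version; the questions, routes, and thresholds are invented for illustration, and a real system (ZappiStore's included) would be far richer and more adaptive:

```python
# Minimal rule-based sketch of a "project design" Q&A bot.
# The objectives, budget threshold, and route names are hypothetical.

def suggest_route(objective, budget, timeline_days):
    """Map a client's answers to a (hypothetical) research route."""
    if objective == "concept test" and timeline_days <= 7:
        return "automated concept-screening survey"
    if objective == "brand tracking":
        return "continuous tracker with automated reporting"
    if budget < 10_000:
        return "DIY survey platform with template questionnaire"
    return "custom study designed with a researcher"

print(suggest_route("concept test", 15_000, 5))
# -> automated concept-screening survey
```

The "next step" the text describes is essentially replacing these hand-written rules with something learned and far less formulaic, so the system can also interpret open-ended results rather than just route the project.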

Complex Analytics. Data sets are getting larger and more complex. Data sets for things like customer satisfaction and brand tracking are also becoming more fragmented as people seek to shorten surveys, add Big Data, and interrogate social media. This sort of complexity needs AI to make it practical and cost effective.
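The fragmentation described above is, at its core, a joining problem: short surveys, transactional Big Data, and social listening each cover a different, overlapping slice of customers. The sketch below shows the kind of per-customer merge such tooling automates; the field names and data are invented for the example:

```python
# Sketch of combining fragmented data sets (short survey, transactional
# data, social listening) into one record per customer. All field names
# and values here are hypothetical.

survey       = {"c1": {"satisfaction": 8}, "c2": {"satisfaction": 5}}
transactions = {"c1": {"spend_90d": 240.0}, "c3": {"spend_90d": 75.0}}
social       = {"c2": {"mentions": 3}}

def merge_sources(*sources):
    """Outer-join several {customer_id: fields} dicts into one view."""
    merged = {}
    for source in sources:
        for cid, fields in source.items():
            merged.setdefault(cid, {}).update(fields)
    return merged

view = merge_sources(survey, transactions, social)
print(view["c2"])  # {'satisfaction': 5, 'mentions': 3}
```

At small scale this is trivial; the AI comes in when the sources disagree, identities must be matched probabilistically, and gaps must be modelled rather than left empty.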

Insights to Action

‘Finding insights is only half the solution that businesses need. The other half is ensuring that insights are turned into action. This process starts by asking the right business and research questions at the outset, and is complemented by conducting good research and finding the story in the data. But organisations need help in ensuring that insights result in action. Many of the consultative (usually boutique) agencies (such as BrainJuicer and InSites Consulting) have been focusing on this for a while now, but it is becoming more widespread. Much of my work with clients has been around both finding insights and turning insights into action.

In 2017 I expect to see more research organisations develop (proprietary) approaches to turning insights into actions, more clients investing time and money in creating processes for turning insights into action, and several new applications developed to help implement findings from research.’

‘For the last few years, implicit measurement has tended to be considered one version of neuro-based techniques. Indeed, it was often considered the less sexy cousin of things like EEGs and fMRI scanners. However, implicit is emerging as both a technique and philosophy in its own right. The reason for this is that implicit techniques and approaches are proving to be more applicable than the rest of the neuro-bag-of-tricks put together. Implicit can refer to specific tools such as implicit association testing, but it is also used as a general statement about valuing observation above questioning, or as Raj Sandhu said in a recent NewMR webinar ‘Task, don’t ask’ – linking implicit to observational paradigms. In 2017 I expect to see more and more research re-badged as implicit, including ethnography, passive data collection, conjoint analysis, semiotics, biometrics, and of course the neuro-based techniques.’

In 2017 talk about AI (Artificial Intelligence) will be everywhere, but it will be more smoke than fire. AI is coming, indeed AI is happening, but it takes time for something as radical as AI to be fine-tuned for market research and for it to become embedded. However, I would recommend everybody to keep an eye on it in 2017, and if you get a chance, make sure you try one of the growing range of projects that utilise AI. Amongst the areas that I expect AI to have an impact are: text analytics, project design, survey design, and data analysis.

Key recent impacts include digital-based data collection tools (fewer roles for interviewers), new approaches such as automated facial coding, MROC management, social media listening, and the use of apps with smartphone-based research.

Among the providers of research services, quotation protocols and sampling/quota management are highly automated, as is survey creation and data processing. Projects increasingly use automation to enable participants to move from one aspect of a study to another without manual intervention.

Repeat tasks are being automated. For example, the fielding of Millward Brown’s Link test is now almost completely automated, saving enormous amounts of time – and consequently money.

In some cases, automation has increased engagement. For example, in real-time tracking it has enabled the data gathering to become seamless (and painless) for the participants, leading to enhanced engagement.

ZappiStore

Automation refers to “the use of systems with minimal human intervention, reducing or eliminating unnecessary human labour activities and thus allowing people to focus on high-intelligence processes”. Automation has been used in market research for a while, but only for single tasks, never for the entire process. It is ZappiStore, in collaboration with industry leaders, that for the first time provides full automation of the end-to-end research process, aiming for a more efficient and effective type of research.

Stephen Phillips, CEO of ZappiStore, says: “Automation provides faster, cheaper and better solutions, promoting better thinking in the research world, with less waste on time- and cost-consuming tasks”. Automation, he explains, means “more research with less budget, leading to greater impact by accessing more and more information”. Spending endless hours in the field, cleaning data, or checking tables is largely unnecessary when all these tasks can be done faster by a machine. Using human resources for high-intelligence tasks is what is going to deliver growth. Researchers are not replaced, nor do they become less important; rather, their roles are upgraded and they become a more valuable asset for the organisation. Researchers are now free to focus on what really matters: not “how to do things” but “what things to do”.

So what are these things that really matter? Finding the right design for a study does matter. Determining the framework for a neuroscience study does matter. Spending endless hours cleaning data does not matter at all, because it is just a computational series of steps. I could list hundreds of tasks to prove my point: it is no longer about processes, it is all about objectives. It is now critical for every researcher to recognise the change and to adapt, seeing automation as a collaborator and not as a threat. Like automation, revolutionary trends such as social media and big data have settled into our lives without many adoption obstacles, establishing a new era in communications, marketing, and data generation.