This site is about everything digital, giving an update on new things as I learn


Measure. Measure. Measure. Tracking the impact of a product is crucial if you wish to learn about your product and your customers. I’ve written before about the importance of spending time on defining the right metrics to measure, avoiding the risk of succumbing to data overload. That’s all well and good, but what do you do when the key things to measure aren’t so tangible!? For example, how do you measure customer feelings or opinions (a lot of which you’ll learn about during qualitative research)?

A few years ago, Kerry Rodden – whilst at Google – introduced the HEART framework, which aims to solve the problem of measuring the less tangible aspects of the products and experiences we create (see Fig. 1 below). Rodden’s approach consists of two parts:

The part that measures the quality of the user experience (the HEART framework)

The part that measures the goals of a project or product (the Goals-Signals-Metrics process)

HEART framework

Happiness – Measures of user attitudes, typically collected via survey. Examples include satisfaction, perceived ease of use, and net promoter score.

Engagement – Measures the level of user involvement, typically via behavioural proxies such as frequency, intensity, or depth of interaction over some time period. Examples include the number of visits per user per week or the number of photos uploaded per user per day.

Adoption – New users of a product, feature or service. For example: the number of accounts created in the last seven days, the number of people dropping off during the onboarding experience or the percentage of Gmail users who use labels.

Retention – The rate at which existing users return. For example: the percentage of this month’s active users who are still active next month, or churn.

Task success – This includes traditional behavioural metrics with respect to user experience, such as efficiency (e.g. time to complete a task), effectiveness (e.g. percent of tasks completed), and error rate. This category is most applicable to areas of your product that are very task-focused, such as search or an upload flow.

Sample questions to ask when applying the framework: does the product help customers achieve their key tasks or outcomes, and why (not)? What should we focus on, why, and how do we best measure it?
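To make the task success category a bit more concrete, here’s a minimal sketch (in Python, using invented sample data) of how you might compute efficiency, effectiveness and error rate from usability-test sessions:

```python
from dataclasses import dataclass

@dataclass
class TaskAttempt:
    completed: bool   # did the participant finish the task?
    seconds: float    # time spent on the task
    errors: int       # number of errors made along the way

# Hypothetical usability-test data for a single task.
attempts = [
    TaskAttempt(True, 42.0, 0),
    TaskAttempt(True, 55.5, 1),
    TaskAttempt(False, 90.0, 3),
    TaskAttempt(True, 38.2, 0),
]

# Effectiveness: percentage of attempts completed successfully.
effectiveness = 100 * sum(a.completed for a in attempts) / len(attempts)

# Efficiency: average time to complete, over successful attempts only.
completed = [a for a in attempts if a.completed]
avg_time = sum(a.seconds for a in completed) / len(completed)

# Error rate: average number of errors per attempt.
error_rate = sum(a.errors for a in attempts) / len(attempts)

print(f"Effectiveness: {effectiveness:.0f}%")       # 75%
print(f"Avg time to complete: {avg_time:.1f}s")
print(f"Errors per attempt: {error_rate:.2f}")      # 1.00
```

The exact definitions (e.g. whether efficiency counts failed attempts) are a choice you’d make per product; the point is simply that each task-success metric reduces to straightforward arithmetic over observed sessions.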

The HEART framework thus works well in measuring the quality of the user experience, making intangible things such as “happiness” and “engagement” more tangible.

Goals-Signals-Metrics process

The HEART framework goes hand in hand with the Goals-Signals-Metrics process, which measures the specific goals of a product. I came across a great example of the Goals-Signals-Metrics process, by Usabilla. This qualitative user research company applied the HEART framework and the Goals-Signals-Metrics process when they launched a 2-step verification feature for their users.

This example clearly shows how you can take ‘happiness’, a more intangible aspect of Usabilla’s authentication experience, and make it measurable:

Question: How to measure ‘happiness’ with respect to Usabilla’s authentication experience?

Goal: The overarching goal here is to ensure that Usabilla’s customers feel satisfied and secure whilst using Usabilla’s product.

Signals: Positive customer feedback on the feature – through a survey – is a strong signal that Usabilla’s happiness goal is being achieved.

Metrics: Measuring the percentage of Usabilla customers who feel satisfied and secure after using the new authentication experience.
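The goal-to-signal-to-metric breakdown above can be sketched in a few lines of Python. The structure (and the survey responses below) are invented for illustration; only the goal, signal and metric wording come from the Usabilla example:

```python
# Goals-Signals-Metrics breakdown for the 'happiness' example above.
gsm = {
    "goal": "Customers feel satisfied and secure whilst using the product",
    "signals": ["Positive survey feedback on the 2-step verification feature"],
    "metric": "% of customers who feel satisfied and secure after using it",
}

def happiness_metric(survey_responses):
    """Turn the raw signal (survey responses) into the metric:
    the share of customers reporting they feel satisfied and secure."""
    positive = sum(1 for r in survey_responses if r == "satisfied_and_secure")
    return 100 * positive / len(survey_responses)

# Hypothetical survey results from five customers.
responses = ["satisfied_and_secure", "neutral", "satisfied_and_secure",
             "satisfied_and_secure", "unsatisfied"]
print(f"{happiness_metric(responses):.0f}% feel satisfied and secure")  # 60%
```

The value of writing it down this explicitly is that the metric is now unambiguous: anyone on the team can see exactly which signal feeds it and which goal it stands in for.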

The Usabilla example of the HEART framework clearly shows the underlying method of taking a fuzzy goal and breaking it down into something which can be measured more objectively.

Main learning point: The HEART framework is a useful tool when it comes to understanding and tracking the customer impact of your product. As with everything that you’re trying to measure, make sure you’re clear about what you’re looking to learn and how to best interpret the data. However, the fact that the HEART framework looks at aspects such as ‘happiness’ and ‘engagement’ makes it a useful tool in my book!

Lawrence Burns is a veteran of the automotive industry. Having worked his entire professional career in the car industry – in Detroit, the birthplace of modern car manufacturing no less – you might expect Burns to be apprehensive about ‘change’ and modern technology. Nothing could be further from the truth: Burns has been an advocate for driverless cars for the past 15+ years, starting his foray into this field whilst at General Motors.

The book starts off with the story of the “DARPA Challenge” in 2004 and how this helped shape learning and development with respect to driverless cars. Burns gives the reader a good close-up of the experiences and learnings from one of the teams that took part in this challenge. At this first DARPA challenge, every vehicle that took part crashed, failed or caught fire, highlighting the early stage of driverless technology at the time.

Bob Lutz, former executive of Ford, Chrysler and General Motors, wrote an essay last year titled “Kiss the good times goodbye”, in which he makes a clear statement about the future of the automotive industry: “The era of the human-driven automobile, its repair facilities, its dealerships, the media surrounding it – all will be gone in 20 years.” There’s no question that driverless cars are coming, especially since both car and technology giants are busy developing and testing them. When I attended a presentation by Burns a few months ago, he showed the audience examples of both self-driving cars and trucks.

In “Autonomy”, Burns brings Lutz’s predictions to life through the fictitious example of little Tommy and his family. In this example, Tommy steps into a driverless car which has been programmed to take him to school in the morning. Tommy’s grandma is picked up by a driverless two-person mobility pod which takes her to a bridge tournament. Burns describes a world where car ownership will be a thing of the past, with people using publicly available fleets of self-driving cars instead.

Together with Chris Borroni-Bird, Burns has done extensive research into the potential impact of an electric self-driving car, looking at metrics such as “total expense per mile”, “cost savings per mile” and “estimated number of parts”. Borroni-Bird and Burns provide some compelling stats, especially when contrasted against conventional cars. Reading these stats helps to make the impact of driverless technology a lot more tangible, turning it from science fiction into a realistic prospect.

Main learning point: “Autonomy” by Lawrence Burns is an insightful book about a driverless future, written by a true connoisseur of the car industry and the evolution of driverless technology.

Why do I keep coming across businesses that struggle to engage with their (prospective) customers, to learn about their needs and behaviours? Too often for my liking, I hear comments like:

“Marc, we’re a startup, we don’t have the time and budget to do customer research!”

“I’m not allowed to talk to customers.”

“In my old place, we used to have a dedicated user research team and they’d just hand me their research report on a platter after they’d spoken to users.”

It therefore felt quite timely when a colleague pointed me in the direction of Michael Margolis, UX Research Partner at Google Ventures. Back in 2013, Margolis delivered a great Startup Lab workshop in which he covered the ins and outs of “User research, quick and dirty”. The recording of the 90-minute workshop is available on YouTube and you can find Margolis’ slides here (see also Fig. 1 below).

Margolis started off his session by talking about the importance of continuously learning about users, seeing things through their eyes. In a subsequent Medium post, Margolis writes that in his experience, startups will typically use UX research to achieve one of these objectives:

Improve a process or workflow

Better understand customer shopping habits

Evaluate concepts

Test usability

Refine a value proposition

Two types of user interviews

It’s great to hear Margolis making a distinction between two types of interviews:

Usability: A usability interview is all about learning whether users can actually use your product and achieve their goals with it. Can users do it? Can they understand it? Can they discover features?

Discovery: Discovery type user interviews tend to be more contextual, and delve more into the actual user. Who? Where? When? Why? How? All key questions to explore as part of discovery, as well as the user’s existing behaviours, goals, needs and problems.

Margolis then talks about combining the two interview types and highlights two sample questions to illustrate this combination:

“How do you do things now?”

“How do you think about these things?”

The distinction between “usability” and “discovery” isn’t just an artificial one. I love Margolis’ focus on objectives, acknowledging that objectives are likely to vary depending on the type of product, its position within the product lifecycle and the learnings that you’re looking to achieve. I’ve found – at my own peril – that it’s easy to jump straight into defining user tasks or an interview script, without thinking about your research objective and what Margolis calls “North Star questions” (see Fig. 2 below).

How are people using existing/competitor products? What features are most important and why?

What barriers hinder users from adopting <product>?

Sample usability questions – as suggested by Michael Margolis:

Can users discover feature X?

Are users able to successfully complete primary tasks? Why (not)?

Do users understand feature X? Why (not)?

In a similar vein, I believe it’s important to distinguish between problem and solution interviews. There’s a risk of your customer insights becoming muddled when you mix problem and solution interviews, especially if you alternate problem questions with solution questions.

In a problem interview, you want to find out 3 things:

Problem – What problem are you solving? For example, what are the common frustrations felt by your customers and why? How do their problems rank? Ask your customers to create a top 3 of their problems (see the problem interview script in Fig. 1 below).

Existing alternatives – What existing alternatives are out there and how does your customer perceive your competition and their differentiators? How do your customers solve their problems today?

Customer segments – Who has these problems and why? Is this a viable customer segment?

Main learning point: In his Startup Lab workshop, Michael Margolis drops a lot of very valuable tips on how to best keep customer research quick and simple, whilst still learning the things about your customer and/or product that you’re keen to learn. So much so that his tips warrant another blog post, which I’ll share soon!

Once you’ve started to think about possible solutions – during Day 2 of the sprint – the next step is to take your huge pile of solutions and decide which solution(s) to prototype. In the morning, you’ll review and critique the different solutions and select those which you feel have the best chance of meeting your long-term goal. In the afternoon, you’ll take the winning scenes from your ‘solution sketches’ and convert them into a storyboard. The goal behind this storyboard is to have a clear plan in place before you create a prototype to test with customers.

Decide

The main objective for the third day of your sprint is to decide on which solutions to prototype. In “Sprint”, Jake Knapp, John Zeratsky and Braden Kowitz suggest a number of techniques to optimise your decision-making process:

Art museum: Put the solution sketches on the wall with masking tape.

Heat map: Look at all the solutions in silence, and use dot stickers to mark interesting parts.

Speed critique: Quickly discuss the highlights of each solution, and use sticky notes to capture big ideas (see Fig. 1 for a breakdown of how speed critique works).

Straw poll: Each person chooses one solution, and votes for it with a dot sticker.

Supervote: The Decider makes the final decision, with more stickers.

Fig. 1 – How speed critique works – Taken from “Sprint”, p. 136:

Gather around a solution sketch.

Set a timer for three minutes.

The Facilitator narrates the sketch. (“Here it looks like a customer is clicking to play a video, and then clicking over to the details page …”)

The Facilitator calls out standout ideas that have clusters of stickers by them. (“Lots of dots by the animated video …”)

The team calls out standout ideas that the Facilitator missed.

The Scribe writes standout ideas on sticky notes and sticks them above the sketch. Give each idea a simple name, like “Animated Video” or “One-Step Signup.”

Review concerns and questions.

The creator of the sketch remains silent until the end. (“Creator, reveal your identity and tell us what we missed!”)

The creator explains any missed ideas that the team failed to spot, and answers any questions.

Move to the next sketch and repeat.

Rumble

A “Rumble” is a test whereby two conflicting ideas will be prototyped and tested with customers on the final day of the sprint. Instead of having to choose between two ideas early on, a Rumble allows your team to explore multiple options at once. If you have more than one winning solution, involve the whole team in a short discussion about whether to do a Rumble or to combine the winners into a single prototype. Knapp, Zeratsky and Kowitz suggest a good decision-making technique, “Note and Vote”, which you can use at any point throughout the sprint where you and your team need to make a decision (see Fig. 2).

Fig. 2 – Note and Vote – Taken from “Sprint”, p. 146:

Give each team member a piece of paper and a pen.

Everyone takes three minutes and quietly writes down ideas.

Everyone takes two minutes to self-edit his or her list down to the best two or three ideas.

Write each person’s top ideas on the whiteboard. In a sprint with seven people, you’ll have roughly fifteen to twenty ideas in all.

Everyone takes two minutes and quietly chooses his or her favourite idea from the whiteboard.

Going around the room, each person calls out his or her favourite. For each “vote”, draw a dot next to the chosen idea on the whiteboard.

The Decider makes the final decision. As always, she can choose to follow the votes or not.
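The Note-and-Vote tally lends itself to a tiny sketch. This is a minimal illustration in Python, with invented names, ideas and votes; the one rule it preserves from the book is that the Decider’s choice trumps the dot count:

```python
from collections import Counter

# Each team member's single favourite idea from the whiteboard (invented data).
votes = {
    "Alice": "Animated Video",
    "Bob": "One-Step Signup",
    "Carol": "Animated Video",
    "Dan": "Animated Video",
    "Erin": "One-Step Signup",
}

# Draw a dot next to each chosen idea on the whiteboard.
tally = Counter(votes.values())
for idea, count in tally.most_common():
    print(f"{idea}: {'●' * count}")

# The Decider makes the final call; if she abstains (empty string),
# fall back to the idea with the most dots.
decider_choice = "One-Step Signup"
winner = decider_choice or tally.most_common(1)[0][0]
print("Winner:", winner)
```

Note that in this run the winner is not the idea with the most dots, which is exactly the point: the votes inform the Decider, they don’t bind her.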

Storyboard

Creating a storyboard is the final activity on the third day of the sprint. The goal here is to create a plan first before you start prototyping. You’ll take the winning sketches – see “Decide” above – and combine them into a single storyboard.

From experience, creating a good storyboard will take a good couple of hours. What makes a ‘good’ storyboard? Knapp, Zeratsky and Kowitz list a good set of rules to help you and your team to fill out your storyboard:

Don’t write together – Your storyboard should include rough headlines and important phrases, but don’t try to perfect your writing as a group. Group copywriting is a recipe for bland, meandering junk, not to mention lots of wasted time.

Include just enough detail – Put enough detail in your storyboard so that nobody has to ask questions like “What happens next?” or “What goes where?” when they’re actually prototyping on the fourth day of the sprint.

The Decider decides – You won’t be able to fit in every good idea and still have a storyboard that makes sense. And you can’t spend all day arguing about what to include. The Decider can ask for advice or defer to experts for some parts – but don’t dissolve back into a democracy.

When in doubt, take risks – If a small fix is so good and low-risk that you’re already planning to build it next week, then seeing it in a prototype won’t teach you much. Skip those easy wins in favour of big, bold bets.

Keep the story fifteen minutes or less – Make sure the whole prototype can be tested in about fifteen minutes. Sticking to fifteen minutes will ensure that you focus on the most important solutions – and don’t bite off more than you can prototype. (A rule of thumb: Each storyboard frame equals about one minute in your test.)

Main learning point: The third day of your sprint is all about ending the day with a storyboard that you can use as a starting point for a prototype, that you and your team will be creating on the fourth day of the sprint.

I personally find it very encouraging to see that more and more companies go down the route of experimentation and continuous discovery. Businesses are starting to realise that committing to a single solution upfront and implementing it in the hope that it will be successful can be a very risky strategy. “Sprint – How To Solve Big Problems and Test New Ideas in Just Five Days” builds on this change by introducing the concept of a 5-day sprint in which to identify problems, explore possible solutions AND get feedback from real customers.

Jake Knapp and two of his colleagues at Google Ventures, John Zeratsky and Braden Kowitz, have successfully applied ‘sprints’ for a wide range of companies, helping the likes of Pocket and Blue Bottle to tackle difficult problems in just five days. This is how the five days are broken down:

Monday (day 1) – ‘Start at the end’; agree to a long-term goal and pick a problem to solve during the sprint

Tuesday (day 2) – Sketch competing solutions on paper

Wednesday (day 3) – Decide on the strongest solutions and turn them into a storyboard

Thursday (day 4) – Build a realistic prototype

Friday (day 5) – Test the prototype with target customers

The “Sprint” book contains a wealth of great techniques to utilise as part of a sprint. I want to do it justice and will probably devote a couple of posts to this great book. Before delving into each of the days of the sprint, let’s start by looking at ‘setting the stage’ before kicking off the sprint. It’s important to have the right challenge and the right team before you begin a sprint:

Challenge – As readers of my posts might know, I’m quite obsessed with understanding the problem(s) worth solving before exploring solutions. I therefore believe that picking the right problem or challenge to solve is absolutely critical to a successful sprint. Knapp, Zeratsky and Kowitz suggest three challenging situations where sprints can help: high stakes, tight deadlines or when you’re simply stuck.

Team – The key thing when assembling a sprint team is to get a ‘Decider’ in the team; someone who’s in a position to make important decisions. This can be the CEO or another important stakeholder. I like how the book provides a number of arguments one can use when a Decider is reluctant to get involved in the sprint (see Fig. 2 below). You should end up with a well balanced team, made up of people who can implement as well as subject matter experts (see Fig. 3 below).

On top of picking a team, it’s also important to have a designated facilitator who can manage time, conversations, and the overall process. Naturally, this can be someone from within your company or someone external. For example, I know lots of digital agencies that facilitate sprints as part of a piece of work for their clients. As much as this is beneficial to the client, it also helps the agency by creating a shared and robust understanding of what’s going to be built and why.

Rapid progress – Emphasise the amount of progress you’ll make in your sprint: In just one week, you’ll have a realistic prototype.

It’s an experiment – Consider your first sprint an experiment. When it’s over, the Decider can help evaluate how effective it was.

Explain the tradeoffs – Show the Decider a list of big meetings and work items you and your team will miss during the sprint week.

It’s about focus – Be honest about your motivations. If the quality of your work is suffering because your team’s regular work schedule is too scattered, say so. Tell the Decider that instead of doing an okay job on everything, you’ll do an excellent job on one thing.

Main learning point: I recommend doing a ‘sprint’ before committing to a specific solution. “Sprint” is a great book, with a lot of helpful guidance on how to best solve big problems in five days. I’d argue that some of the techniques used as part of a sprint shouldn’t be constrained to a 5-day period or the start of a piece of work; I’ll outline in the coming posts how some of the sprint exercises can be used on a continuous basis.

As part of the Mobile Academy curriculum, I recently attended a class by Priya Prakash on “design principles”. Priya is a very experienced designer and has founded Design for Change, a London-based urban experience design studio.

Priya started off the session by explaining that design principles describe the experience and core values of a product or a service. Design principles help in making decisions about your product. She referred to a great definition of design principles by Luke Wroblewski (see Fig. 1 below). The important part of Luke’s definition is that all decisions can be measured against design principles.

“Design is what you decide not to do” was one of the key points that Priya raised in this class. It’s all about doing less and simplifying things. She talked about Spotify and Google Glass as good examples in this respect:

Content first – Focus on the content, and remove any unnecessary user interface elements.

Get familiar – Even though there is a clear distinction between a “lean forward” mode (Spotify desktop app) and “lean back” mode (Spotify mobile app), there’s a unified design language which has been executed consistently, irrespective of the device that you access Spotify from.

Don’t get in the way – Google Glass is designed to be there when you need it and to be out of the way when you don’t. The goal is to offer engaging functionality that supplements the user’s life without taking away from it.

Keep it relevant – Deliver information at the right place and time for each Google Glass user.

Personality – For example, the Pitchfork app has a magazine-like feel. It’s about understanding what the content is and translating this into appropriate behaviours.

Responsive – Priya talked about the Clear app as being very responsive, explaining how this app gracefully expands or contracts.

Context – Motion should give context to the content on screen by detailing the physical state of those assets and the environment they reside in.

Emotive – This principle is all about evoking a positive emotional response. This kind of response can be triggered by a wide range of user interface elements, for example a smooth transition or a nice animation. Yelp‘s app is a good example in this regard.

Orientation – Motion should help ease the user through the experience. The “orientation” principle means that motion should establish the “physical space” of the app by the way objects come on and off the screen or into focus. The key is to get the flow of actions right, guiding the user on her journey and making sure she doesn’t feel lost or confused. Mobile apps like Yelp and Evernote do this pretty well in my opinion.

Restraint – Keep it simple! Similar to the abovementioned “orientation” principle, it’s important not to bombard the user with too much animation or confuse them with too many interactions to choose from. This is one of the reasons why I’m such a big fan of single-purpose apps; I like the simplicity that they offer and the level of design restraint that they tend to apply.

Main learning point: I learned a lot from Priya Prakash’s class on design principles, particularly with respect to motion user interface design principles. Design principles can provide valuable guidance for the design of any software product or service and should therefore not be taken lightly. Thanks to Priya for a great class!

“Design principles are the guiding light for any software application. They define and communicate the key characteristics of the product to a wide variety of stakeholders including clients, colleagues, and team members.”

“Design principles articulate the fundamental goals that all decisions can be measured against and thereby keep the pieces of a project moving toward an integrated whole.”

Fig. 2 – What makes a good design principle? – Taken from Priya’s lecture at the Mobile Academy on 14 October ’14:

I guess we all know how frustrating it can be to have to sit in meetings that just feel like a waste of time or that could have been dealt with in 30 minutes (instead of 3 hours). I know that there are quite a few apps out there which help us to run more productive meetings, but I decided to focus on Do:

How did this app come to my attention? – I got an alert from Product Hunt about Tools for Product Managers, promising me a list of “the tools the pros use”. Do was only ranked 10th on this list, but I guess it was this comment from one of the Product Hunt voters, that intrigued me the most: “I was a Yammer PM. Do.com is the meetings platform I wished I had.” Especially given that it came from a guy who used to be at Yammer – who are all about collaboration within the enterprise – this comment made me want to find out more about the product.

My quick summary of the app (before using it) – Do helps you to have more productive meetings; I therefore expected a tool which helps its users make their meetings as efficient as possible. The tool doesn’t yet seem to be available on iOS or Android, only on desktop.

Getting started, what’s the sign-up process like? – I have to sign up to use Do. At present, Do only seems to support Google accounts; all non-Google users will be notified as soon as they’re able to sign up (see Fig. 1 below). Once I’ve selected my Google account, I’m presented with a permissions screen (see Fig. 2 below). I click “Accept” and my personal dashboard appears. All fairly straightforward.

How does the app explain itself in the first minute? – The default page of my dashboard shows a simple timeline with meetings on the relevant dates and times (see an example in Fig. 3 below). To be honest, I felt a bit underwhelmed at first, thinking “is this it!?”. However, the subsequent overlay which consisted of six ‘how to’ screens was quite useful, explaining in a simple but effective way how to best get started on Do (see Fig. 4 below).

How easy to use was the app? – Using the tool felt very intuitive and easy. The layout of the dashboard is clear and easy to understand. Adding a new meeting to the dashboard felt no different to doing the same thing in Google or Outlook (see Fig. 5 below).

How did I feel while exploring the app? – Like I mentioned above, exploring Do felt incredibly easy and intuitive. The signposting used in the tool is self-explanatory and the navigation options have been kept to a minimum. A quick click-through on an individual agenda item highlighted a key purpose of Do; the ability to create and share a meeting outline, making it easy to collaborate around meeting goals and agenda items (see Fig. 6 below).

Did the app deliver on my expectations? – Yes, it did. I felt a bit underwhelmed at first, expecting Do to provide more, ‘less obvious’ features. However, whilst playing with the application, I discovered features like “Invite” and “Takeaways”, which I believe are missing from most standard diary / meeting applications.

How long did I spend using the app? – A few days to start with, but I expect to be using it a lot more in the future!

How does this app compare to similar apps? – I had a quick look at MeetingHero which serves a similar customer proposition to Do. At a first glance, MeetingHero seems a bit less advanced and intuitive in comparison to Do. MeetingHero is, however, available as an app on iOS which means that the app can be used on the go.

Main learning point: Do is a straightforward and easy-to-use meeting app. I like its interface and its key features; the app makes collaborating around meetings very easy. It will be interesting to see how Do will perform in an already crowded marketplace, with apps and systems that enable similar things. I’m now curious to see what the mobile version of the application will look like!