Methods & Practices

March 25, 2014

In a previous article, I introduced five tools for effective behavioral design: the motivation/ability matrix, triggers, routine/ritual mapping, rewards, and flow/challenges. In this post, I will sketch a behavioral design using these tools.

Using the collaboration workflow as an example, let’s sketch a couple of concepts for a "reviewing a design online" workflow. Our manager persona is used to offline meetings, annotating drawings with a red pen, and arguing, negotiating, and simply hustling over a design proposal within the closed loop of his team. He is hesitant about the idea of online collaboration and of opening the floor to the whole company. How might we introduce a design tool that helps him change his reviewing habits and makes them more fruitful and effective?

Figure 1. Manager Persona likes his traditional ways of collaboration

We can start with our persona. What are the motivation and ability issues for him? We might assume that he doesn’t have major ability issues, since reviewing online is a low-barrier skill.

Motivation-based design

How can we increase his motivation to adopt this new behavior? Motivation, according to Dan Pink, depends on autonomy, mastery, and purpose. How might we increase his motivation by touching one of these dimensions? Autonomy is about a sense of control, mastery is about a sense of achievement, and purpose is about the desire to pursue meaning in a specific situation.

The motivation/ability matrix can be used as a mapping tool to identify possible collaboration motivators. Seeing his suggestions heard and implemented by co-workers will motivate the user by giving him a sense of achievement. If he is appreciated and mentioned in the final design, he is purposefully part of something bigger than his daily workflows. Finally, if he can edit and delete any of his comments in the collaboration platform, he will feel in control.

Figure 2. Being heard and complimented helps him to feel a sense of purpose and being part of a community

Motivation provides intrinsic dimensions to influence our behavior design, but what about external motivators? We can think of triggers and rewards that address one of the three dimensions. For instance, points, levels, and completion rates might be good rewards that keep the user invested in the product. We might then introduce a collaboration mastery path and a collaboration currency within the product, where the user earns more points and levels as he collaborates more.

We also need to be careful with rewards. The user can easily become bored once the rewards have become routine. In the long term, rewards should lead to a flow experience, where the user faces challenges that balance ability and motivation.

Trigger Design

How do we situate triggers and challenges in our persona’s daily workflows? This brings us to our routine/ritual mapping tools. Based on user engagement, the designer can organize participatory design sessions with the user, map their daily routines, and dive into any rituals. A whiteboard, post-its, and big sketch pads are useful for mapping activities. Routines are easy to dig into because they have lower barriers. They tell us when he works solo, collaborates, takes breaks, and so on. Once we know more about these, we can decide on the type of triggers, their frequency, and their interval. For example, the designer realizes that our persona uses his commute to check what others in the company are doing via his smartphone. Such an insight is valuable when considering when and how to nudge our user with triggers such as push notifications or posts embedded in his feed.

Rituals are a bit different; the designer needs to dig deeper with “why” questions and identify where the user invests his precious time and attention. This helps us understand what he really cares about and values. We may find that he values one-to-one interactions in collaboration over online encounters. For him, design reviews are ritualistic: how he uses his red pen, the way people listen, and the way comments are passed back and forth, blended with both seriousness and humor. This might be a strong insight to inform the design. The designer might add real-time collaboration meetings to the suggested tool, and even a virtual red pen that can be passed between participants to annotate things on the screen.

As illustrated here, behavior change tools bring deeper, more fruitful insights about users and why they may or may not be adopting the behaviors we have designed into products. These tools are complementary, and in certain contexts one tool takes priority over the others.

March 18, 2014

Social media makes it easy for people to post news, discuss ideas, ask questions, and share links. Seen through the eyes of learning content development (LCD) teams, social media is a great way to make connections with the consumers of our learning videos and help topics. We can discover how well our content is working for our audience, discuss ideas to improve it, respond to questions, and boost information discovery by sharing links to new content. We even have the potential of expanding our sphere of influence and reaching a new audience.

So, when we were preparing to launch a new online help site, my LCD team embraced social media…and helped our customers make the shift from “content you find” to “content that finds you.” Here’s what we did…and what we think you can do, too:

1. Choose an outlet

We chose Twitter to acclimate our social-media-savvy customers to our new help platform and learning content offerings. Through our Revit Help Twitter account, we pointed people to the content they needed, answered questions as they came up, and evangelized potentially unknown tips and tricks. With its compact messaging, Twitter worked well for us. Other outlets, such as Facebook, would also be suitable for communicating in short bursts with a user community.

2. Tune in

We began by tuning into the helpful tips that advanced users of Revit were posting on their blogs and in user forums. When posts revealed shortcomings in our online help content, we improved our help topics, and then pointed people to the new material through our Twitter account. When posts shared fun Revit-related news, ideas, and project results, we pointed our followers to the posts with tweets. When a new version of Revit was released, we began tweeting help topics for new features, while continuing to share our users’ content.

Using this blended approach, we were able to provide a variety of useful information to our customers, from entry-level users to advanced users. And as the Revit Help Twitter account grew, so did the traffic to our new online help site, as our content was retweeted and shared through other social media.

3. Experiment, analyze, adjust

As our followers increased in number, we knew we could increase our impact by tweeting at different times to entice users in different time zones and with different work schedules. We experimented by scheduling tweets at different times of the day and week, including weekends, to maximize visibility. We used flexible social media management software to handle the scheduling, and found that both Sprout Social and HootSuite were good choices. Our experiments paid off. We found the optimal tweet times for our community, earned a big increase in followers, and reached a new audience: university students who were learning Revit on the weekends.

In addition to automated scheduling, social media management software gave us access to a solid set of analytics tools. In addition to seeing aggregated click-through and retweet rates, we were able to see user demographics, including gender, age, and geographical location.

4. Listen and learn

We regularly followed our hashtags, commented, and contributed to our users’ conversations. We learned that social media doesn’t just facilitate communication, it requires it for success. Contributing to customers’ conversations gave us an opportunity to better understand their needs and to identify opportunities to meet some of those needs with new or enhanced learning content.

5. Have fun

As technical communicators, we’re driven to help customers, so what could be better than helping them in a fast, collaborative environment? They learn from us, we learn from them, and (maybe best of all) they learn from each other.

For LCD teams, engaging with customers on social media can be powerful and satisfying. It also meets Revit users’ desire to connect with Autodesk.

March 11, 2014

While designing software for professional domains like engineering, urban planning, and architecture, interaction designers and developers use tools like personas, workflows, and detailed use cases to capture the building blocks of UX.

However, these tools can fall short when introducing new, emerging behaviors or framing changing technological and social paradigms. To understand and design experiences around the changing and the emerging, we need more nuanced perspectives than simply capturing what already exists.

Behavioral design offers a rich set of tools that designers can use to tackle domain-specific work-habit challenges. Here, I will present five of these tools. In a follow-up post, I will show how we might use them to stage a persuasive intervention in a collaboration workflow.

1. Motivation-Ability Matrix

Figure 1: B.J. Fogg’s behavior model

Conceptual matrices are useful for designers in framing their problem as well as their solution space. B.J. Fogg provides a simple matrix for his behavior model, suggesting two dimensions to behavior change: motivation and ability (see Figure 1).

Behavior change happens when an individual with the right motivation and ability threshold is nudged by external triggers over a designated time. Someone might have the motivation but not the ability, or vice versa. By studying domain experts, a designer can find concrete hooks for both. The sweet spot between the two defines where triggers can intervene.
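Fogg’s model is often summarized as B = MAT: a behavior happens when motivation, ability, and a trigger converge above an activation threshold. A minimal sketch of that idea (the 0–10 scales and the threshold value are illustrative assumptions, not part of Fogg’s formal model):

```python
# Hedged sketch of B.J. Fogg's B = MAT idea: a behavior fires only when
# motivation x ability crosses an activation threshold at the moment a
# trigger arrives. The 0-10 scales and the threshold are illustrative.

def behavior_occurs(motivation: float, ability: float,
                    trigger_present: bool, threshold: float = 25.0) -> bool:
    """Return True if the nudged behavior is likely to happen."""
    if not trigger_present:
        return False  # no trigger, no action, however motivated the user is
    return motivation * ability >= threshold

# High ability (an easy task) compensates for modest motivation...
print(behavior_occurs(motivation=4, ability=9, trigger_present=True))   # True
# ...but a trigger alone cannot rescue low motivation and low ability.
print(behavior_occurs(motivation=2, ability=3, trigger_present=True))   # False
```

The point of the sketch is the interaction: a trigger is necessary but not sufficient, and strength on one dimension can offset weakness on the other.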

2. Triggers

Figure 2: Hooked model, on triggers and rewards.

Nir Eyal developed an actionable framework around triggers called the Hooked model, a four-step loop of trigger, action, variable reward, and investment (see Figure 2). He argues that to build a habit, designers need to introduce triggers and reward the user in a dynamic manner, varying the reward each time. Once the user begins investing in the product, she will come back and use it more often.

Triggers go by different names, such as cues, signals, or nudges, and can take visual, haptic, or behavioral form. Mobile apps that push updates are a good example of a trigger. Trigger design needs to be carefully crafted and shouldn’t add to the user’s cognitive load. People are easily annoyed by triggers that are too insistent, hard to bypass, or attention-hungry. Trigger design should take subtlety and peripheral attention as its principles.

3. Routine-Ritual Maps

Figure 3: Ritual & Routine Maps

Contextual design emphasizes the importance of activities and situation in approaching habitual change. In a paper I published, I focused on situational dynamics and mapped out individuals’ daily routines and rituals for habitual change. Routine maps reveal the touch points of an existing habit and its potential potholes, whereas ritual maps unveil the user’s emotions and value landscape. Ritual maps also show the designer where a user prefers to invest his time and where his attention and awareness go. Using both, the designer can look for sweet spots where the user’s attention and awareness are more open to possible design interventions.

4. Rewards

The idea of rewards goes back to the experiments of B.F. Skinner, who showed that behavior changes in response to external reinforcement. Gamification is a variant of behavioral design, defined by Gabe Zichermann as “the process of using game thinking and game mechanics to engage users and solve problems.”

Zichermann uses a rewards model involving status, access, power, and stuff (SAPS). He states that status is the reward people appreciate most. Eyal suggests using variety, since people are easily demotivated and lose interest with fixed conditioning. Michael Wu notes the importance of the timing and schedule of rewards; for instance, he suggests a fixed-interval schedule as an effective method when activity needs to increase near deadlines.
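The difference between these schedules can be sketched in a few lines of code; the probabilities and point values below are illustrative assumptions rather than anything prescribed by Eyal or Wu:

```python
import random

# Sketch contrasting two reinforcement schedules discussed above.
# All point values and probabilities are illustrative assumptions.

def fixed_interval_reward(actions_since_reward: int, interval: int = 5) -> int:
    """Fixed-interval: a predictable reward after every `interval` actions.
    Predictable schedules are easy to routinize -- and to get bored of."""
    return 10 if actions_since_reward >= interval else 0

def variable_reward(rng: random.Random) -> int:
    """Variable schedule (Eyal's suggestion): whether a reward appears,
    and how big it is, both vary -- which sustains anticipation."""
    if rng.random() < 0.5:
        return 0                      # no reward this time
    return rng.choice([5, 10, 25])    # surprise: small, medium, or big

rng = random.Random(42)  # seeded so the sequence is repeatable
print([variable_reward(rng) for _ in range(6)])
```

Even this toy version shows why variety matters: the fixed schedule’s output is fully predictable after one cycle, while the variable one keeps the next outcome uncertain.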

5. Flow Principle

Figure 4: Csikszentmihalyi’s Flow model

Rewards, however, may not lead to behavior change unless the designed interventions are challenging enough for the target audience. This brings us to the flow principle, developed by Csikszentmihalyi in his seminal book Flow. Flow happens when an individual is so immersed in an activity that she loses track of time and of herself.

This happens when the individual faces a challenge that is complex yet doable within her skillset, which aligns with the motivation-ability matrix mentioned earlier. Triggers should lead to challenges that balance a sense of control and mastery with the curiosity to pursue habitual change.

February 28, 2014

The critical incident technique (CIT) was first described by Flanagan (1954), as a method for collecting negative and positive incidents that contributed to the success or failure of a task.

Critical incidents are brief and memorable descriptions of actions that a person or group performs in particular situations that lead to either effective outcomes (successes) or ineffective outcomes (failures or near misses).

Ideally, critical incidents are real incidents reported by actual users doing real work in their normal work environments.

The word “critical” often refers to a severe usability problem or a clear success. But sometimes a critical incident is something relatively minor, like an error message on a new system that forces you to waste time debugging when your other tasks are more important.

November 19, 2013

Software designers repeatedly face the challenge of making their products usable. One important and often overlooked principle is to make them simple, which is harder than you might imagine.

A simple software application is easier to learn. It makes the user feel in control, reduces uncertainty while making their actions more deliberate, and lets them accomplish tasks efficiently. Simple software applications are, in my opinion, also more pleasurable to use.

Complex software applications are harder to learn and usually grow more complex over time, making it increasingly difficult for mainstream users to differentiate the core features (most used) from the peripheral ones (less used). Experienced users are often surprised to discover newer, more efficient features (several versions after their introduction into the application), because the older variant of the feature that they use daily coexists with the new one.

If simplicity isn’t considered as a key component of the design process, then software applications become increasingly complex as more features get added. We’re really good at adding new features but not so good at eliminating or retiring stuff.

Giles Colborne has thought a lot about simplicity as it relates to usability, and proposes four basic strategies for designing for simplicity in his book, Simple and Usable.

Remove – Eliminate unnecessary or obsolete features entirely from the interface so that only the essentials core to your target audience remain. This isn’t an easy task, and it will often be met with resistance, both from your customers and internally from a variety of stakeholders. However, if you automatically track feature usage in your application, you’ll have a set of data analytics that can make the business decision to remove a feature less emotional and more objective.

Organize – Arrange items into manageable groupings or “chunks”. This effectively puts multiple related items into categories, thereby reducing the number of things a customer has to view or remember at any time. Items can be organized by related purpose, hierarchy, layers, colors, size, location, alphabetical order, etc. Organizing is a frequently used strategy for making software applications simpler and easier to use.

Hide – Place features out of immediate view but still make them easily accessible. Features that are used infrequently are often good candidates for being hidden. For example, preferences are something that can usually be hidden since they are used so infrequently. Features can also be designed to progressively disclose more controls to further extend a simple feature so it suits the needs of expert users.

Displace – Move features to another location entirely. In his book, Colborne uses the DVD remote control as an example, whereby instead of having many “hard” buttons on the remote, it displaces these features to “soft controls” on a menu on the TV screen, simplifying the remote as a result.

Displacing may involve moving some complex features from a mobile device to the cloud, reducing the complexity of the mobile application and taking advantage of the strengths and power inherent in the cloud. Displacement can also involve combining related features into one general tool. For example, a tool that combines move, rotate, and scale capabilities would be efficient in a 3D software application, since those operations are often done in conjunction with each other.

Which of these four strategies could be employed to simplify this toolbox mockup?

One solution groups and hides tools. The icons displayed below the dashed line indicate hidden tools accessed from a drop-down list containing other similar tools for that category.

Organization – A default icon from each grouping was chosen to represent each category based on what a typical software user might require.

Hide - A small triangle graphic was added to the lower-right corner of each icon position to indicate that additional hidden tools are available. This is a common convention that many existing applications employ.

Remove or Displace - The gear cog icon was removed entirely. The preferences it represents are accessed by incorporating the preferences in each tool or by displacing the preferences feature to a suitable menu/location within the user interface.

Simple products have repeatedly been proven to have a profound impact in the marketplace. It’s our job as designers to make a case for making products simpler and more usable whenever and wherever possible.

Practice Giles Colborne’s strategies by simplifying another of his examples.

November 12, 2013

What do you do when your product is used by an animator for a small game studio, a marketing manager for an aviation/engineering company, and a visualization specialist for a massive infrastructure corporation? How do you identify and prioritize requirements for such a diverse set of users? That is the challenge we face with 3ds Max, and to help solve it we invited twelve of our most vocal customers to come to our office in Montreal. We bribed them with the promise of good food, city life, and the chance to contribute to the design of a key application in their workflow.

We wanted to give 3ds Max users from a diverse set of industries the chance to tell us what they need in the product. We were also curious to see if they could all agree on what Autodesk should focus on for the next few years. Would each camp stand their ground for their team? Would some kind of consensus be attained? Let’s see what happened...

At first, we gave users some time to “vent” in an unscripted fashion. Then we got down to business and asked them to write down requests for 3ds Max on post-its - as many as they could write in 15 minutes. This produced a LOT of requirements and feature requests. After unifying the duplicates, we conducted an open-sort affinity grouping exercise with everyone chiming in. This went surprisingly well, with people quickly settling on a common set of categories into which all of the post-it ideas could fit.

This wall of ideas was great, but far too big for us to tackle as a group in the brief time we had together. We needed a way to narrow the list, but how could we identify the most important ideas? By asking our users to vote! We gave stickers to each participant to put on the ideas and issues they most wanted us to work on. Knowing human nature, we devised a simple mechanism to keep people from voting only for their own ideas: only three stickers out of twelve could be used for their own ideas (the stickers were color-coded and initialed…). Seven stickers could be used to vote on other people’s requests, and two red stickers could veto requests that made no sense to them. The voting was a success except for one thing: no one used their red stickers. It was probably because their initials were on them and they did not want to offend others, or because they could not bring themselves to remove a feature that might be vital to someone else. Was this the sign of a group dynamic emerging?
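The sticker rules above amount to a small constraint check per participant: at most three of twelve stickers on one’s own ideas, up to seven on others’, and two vetoes. A hypothetical sketch of how such a tally could be validated and counted (the idea names and data structures are made up for illustration):

```python
from collections import Counter

# Hypothetical sketch of the sticker-voting tally described above:
# each participant gets 12 color-coded stickers -- at most 3 for their
# own ideas, up to 7 for others' ideas, and 2 red veto stickers.

def validate_ballot(own_votes, other_votes, vetoes) -> bool:
    """Return True if a participant's sticker use obeys the workshop rules."""
    return len(own_votes) <= 3 and len(other_votes) <= 7 and len(vetoes) <= 2

def tally(ballots):
    """Sum votes minus vetoes per idea across all valid ballots."""
    scores = Counter()
    for own, other, veto in ballots:
        if not validate_ballot(own, other, veto):
            continue  # skip ballots that overspend their stickers
        for idea in own + other:
            scores[idea] += 1
        for idea in veto:
            scores[idea] -= 1
    return scores

ballots = [(["modifiers"], ["viewport", "ui"], []),
           (["ui"], ["modifiers", "viewport"], [])]
print(tally(ballots).most_common())
```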

The post-its with the most votes were singled out, combined, and grouped again to identify four themes to be tackled in the next step. Interestingly, two themes were more industry specific, while the other two were related to UI, usability, and features that could be beneficial to all. Together, the group identified a series of requirements to create a Design Brief for each theme.

Then the main group was divided into four breakout teams, which also included developers and QA. Simulating a design charrette, each group took their design brief and worked to further refine the requirements. High-level design solutions were sketched. One representative from each group then presented the work to the whole group, where everyone worked to refine the concept and clarify the goals and requirements. In a second breakout, the design was developed in more detail, incorporating the feedback from the group.

In the end, this participatory design activity helped the design team identify the most important areas to address in future releases. But it also served to make these vocal users aware of the process we have to go through when choosing what to work on. We may not have achieved complete consensus amongst everyone in the room, but we did discover a great way for the many voices of 3ds Max to be heard.

November 05, 2013

The monetary method is an individual or group technique for prioritizing requirements by asking users and stakeholders to “buy” requirements from a “wish list” given a fixed (imaginary) budget of, say, $1200. Each requirement is described briefly, and a relative cost is associated with developing that feature.

This method for prioritizing requirements forces potential users to think carefully about which requirements are most critical. You can do this online without play money by just sending out the list and asking people to choose up to $1200 of items by checking cells on a questionnaire (and setting things up so no one can overspend).

Conte (2004) uses a variation on the monetary method as part of a rapid task analysis. Conte had a group of users write down, in 10 minutes, all the tasks that define their work on index cards - one task per card. The users then grouped the cards into different categories of work and listed task frequency (low, medium, high, really high) and task importance (low, medium, high) for each item. Then the participants were given $1000 in fake $100 bills, and everyone was asked to pin one or more of the $100 bills on the tasks they wanted the new product to support.

When to Use: You can use the monetary method to:

Prioritize requirements

Learn what general sets of requirements, features, content, or other items provide the most value to different groups

Get users to make clearer distinctions about what requirements are essential

Make tough trade-offs when you need to drop features

Strengths and Weaknesses:

+ Low cost, and minimal training

+ Fun and engaging

+ Can be conducted in person or online

+ The process of asking people to buy a set of requirements on a fixed budget provides a clear differentiation between important and “nice-to-have” requirements

+ Data analysis is simple

– You need a large sample of potential users

– Some companies worry about doing this online since it could reveal something about future products to competitors

– This method gets more complex if you have to consider dependent requirements

Procedure:

The basic monetary method procedure is:

Describe a small set of core requirements that will definitely go into a new version of the software. These “must have” requirements are guaranteed and not included in the list of requirements that users can buy.

List a set of proposed requirements with clear descriptions and the relative costs to develop those features. For example, features that are easy to develop (e.g. 1-2 weeks) might cost $100, features that require a moderate development effort (e.g. 3-4 weeks) might cost $200, and features that are complex and difficult to develop (e.g. more than 4 weeks) might cost $300. The figure below shows a monetary method form that contains core requirements, instructions, and example requirements.

Brief the participants about the meaning of the relative costs. For example, you might say that $100 requirements take 1-2 person-weeks to develop, $200 requirements take 2-4 person-weeks, and $300 requirements take more than 4 person-weeks of development effort.

Give the participants play money and tell them that they can “buy” any combination of features as long as they don’t spend more than the allotted funds.

Have users buy a requirement by circling (or choosing, if online) the value in the Relative Cost column and putting the money for that requirement aside. The amount of play money will depend on the number of features you present to the participants. If you gave each participant $1200 in play funds, they could buy four expensive features at $300 each, six medium-priced $200 features, or various combinations of low, medium, and expensive features - as long as they don’t go over $1200. I generally give people enough money to buy three to five of the major features, though you can vary the amount of money to restrict the number of choices. Another rough rule for assigning funds is that participants shouldn’t be able to buy more than about 30% of the total number of requirements.

Tabulate the results.
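The buying and tabulating steps above can be sketched as a budget check plus a tally. The feature names and ballots below are hypothetical, using the $100/$200/$300 cost tiers described above:

```python
from collections import Counter

# Hypothetical sketch of the monetary method: participants "buy"
# requirements within a fixed $1200 budget; costs follow the
# $100/$200/$300 tiers described above. Feature names are made up.

COSTS = {"batch export": 100, "dark theme": 100,
         "cloud sync": 200, "plugin API": 300, "realtime co-edit": 300}

def within_budget(purchases, budget=1200) -> bool:
    """A ballot is valid only if the participant does not overspend."""
    return sum(COSTS[p] for p in purchases) <= budget

def tabulate(ballots, budget=1200):
    """Count buyers per requirement, and also sum dollars spent on it
    as a rough weight of stated priority."""
    votes, dollars = Counter(), Counter()
    for purchases in ballots:
        if not within_budget(purchases, budget):
            continue  # discard overspent ballots
        for p in purchases:
            votes[p] += 1
            dollars[p] += COSTS[p]
    return votes, dollars

ballots = [["plugin API", "cloud sync", "batch export"],      # $600
           ["plugin API", "realtime co-edit", "cloud sync"],  # $800
           ["dark theme", "cloud sync"]]                      # $300
votes, dollars = tabulate(ballots)
print(votes.most_common(2))  # "cloud sync" was bought by all three participants
```

Counting both buyers and dollars spent is a small design choice: the dollar totals surface requirements that fewer people chose but were willing to spend heavily on.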

References:

Conte, L. (2004). The color of money – an agile technique for the prioritization of requirements. Proposal to the Boston UPA Miniconference. Natick, MA.

Gray, D., Brown, S., & Macanufo, J. (2010). Gamestorming: A playbook for innovators, rulebreakers, and changemakers. This book includes a method called the “$100 Test” that is similar to the monetary method.

October 29, 2013

Ed de Guzman and I attended HCI International to present some of our UX work on visual design, and to see what we could learn from international scholars.

Our own presentation was a paper and poster on “Desirability Methods for Evaluating Visual Design”. In essence, we advocate the use of some innovative research techniques to understand how users make meaningful choices among visual design concepts. Fortunately, for our study we had access to a wealth of iconography, imagery, and logo design options from Autodesk’s brand redesign earlier this year.

Our study used three scaffolding techniques to assist end-users in articulating their preferences and perceptions of visual branding:

The Think Aloud method allows users to review alternative visual design layouts while accomplishing a task. As they do so, they are encouraged to think aloud about what they are observing. This is the least structured technique.

The Visual Design Card Sort (de Guzman & Schiller, 2011) method is a modification of the Microsoft Product Reaction Card deck from their Desirability Toolkit. This technique allows users to choose three to five words that represent their interpretation of a visual design concept.

The Visual Design Mad Libs method involves showing visual design alternatives to users and asking them to complete a structured sentence such as “This logo is <company name>, it is a <describe logo> because <explain why the design is appropriate for the company>”. This is the most structured technique.

We conducted three studies, each using one, two, or three of these techniques, to determine which were most effective. In summary, we found that the more structured methods produced more specific, actionable results for the visual design team. If you are interested in seeing the full poster, let me know.

Elsewhere around the conference, Hiroshi Ishii, of MIT Media Lab, delivered an excellent keynote address on the topic of “Defy Gravity: The Art of Tangible Bits”. Referencing some fascinating tangible UIs such as Ping Pong Plus Plus and the I/O Brush, he eloquently described how devices are like faucets of information. Just as water evaporates in physical landscapes, information is reused and curated in digital places. Ishii challenges the HCI community to focus on radical atoms of “future dynamic materials that are computationally reconfigurable”.

Ishii’s dream of interactive tangible interfaces meshes beautifully with Autodesk’s vision of democratizing design and engineering to help people imagine, design, and create a better world. With the advent of 3D printable circuitry there will be many powerful ways to create these technologies using Autodesk tools.

Additionally, we attended sessions on

Design, ergonomics, and usability

Cross-cultural design

Human aspects of information security, privacy, and trust

A session by Chi-Hsien Hsu et al. looked at translating the PAD emotional state model into Chinese for a cross-cultural understanding of pleasure, affect, and desirability. Autodesk, as a global company, is interested in this, and it’s a personal interest of mine.

Other sessions covered perceptions of usability and brand attributes. For example, a paper by Tareq Ahram et al. compared competitor products using 13 sensibility words. This gave us inspiration to learn more about how to compare Autodesk offerings, such as Autodesk 360, to their competitors. Similarly, Min-Xian Sun et al. presented an interesting paper on using similarity and dissimilarity pairs to study user perception.

Finally, we are very aware that users’ trust of cloud services, like Autodesk 360, rests soundly on strong privacy and high availability. These topics were addressed in several great sessions, like those led by Kathleen Hogan and John Bustard.

We continue to look to the academic community and practitioners of HCI to find the best practices to make Autodesk 360 compelling and engaging for design and engineering users. What are your thoughts on marrying user research and visual design? Have we done enough or too much? Leave us a comment below!

September 26, 2013

In my ten years as a UX designer, I have used many user interface prototyping tools, ranging from paper and pen to Microsoft PowerPoint to real code, and everything in between. These days I tend to rely heavily on PowerPoint to aggregate mixed media such as scanned sketches, screenshots, and output from other prototyping tools like Balsamiq. There has been an explosion of UI prototyping tools recently, and I have tried many of them. I am fairly hard to please; however, I recently came across a new player in this space that impressed me with its balance of simplicity, flexibility, usability, features, and price: Indigo Studio from Infragistics. Indigo Studio runs on Mac AND Windows, and the file format is compatible between the two versions. In this video I highlight some of the basic features and show examples of a recent project I used it on.

How much does it cost? Version 1 was free until recently. Version 2, which was just released with the iOS pack and other features, has gone up significantly: US$495/user/year. But if you buy before October 31, 2013, you can get it for $99 with the promo code INDIGO42Q.

September 10, 2013

In our product design and development world, UX practitioners often look at competition through traditional marketing glasses. We compare our product with similar products in terms of features (current and future) and branding qualities like desirability and customer satisfaction. At UXPA this year, I attended a presentation by Beverly Freeman from eBay titled Competitive UX Intelligence: A Primer that proposes a new and interesting way to include user experience in the competitive intelligence mix.

In addition to product, strategy, and brand lenses, Mrs. Freeman argues that a UX lens can “enrich the story” of our competitive intelligence analysis. Through the UX lens, she encourages us to look not only at the direct user experience of a competitor’s product but also at the experiences that surround and impact your product’s UX, from start to end. She categorizes those experiences into 4 types (in orange below).

An upstream competitor is “someone or something that makes people choose not to use your product because of what they have to deal with before using your product”. Drive-thru Starbucks locations or grocery stores with close-to-entry parking spaces reserved for families are examples she gave to illustrate competing upstream user experiences.

A downstream competitor is “someone or something that makes people choose not to use your product because of what they have to deal with after using your product”. As an example of that type of competition, Mrs. Freeman described how some cloth baby diaper providers offer a pick-up service so that environment-conscious parents don’t have to wash their darlings’ dirty diapers.

A companion competitor is “someone or something that makes people choose not to use your product because of what they have to deal with while using your product”. Good examples discussed by the presenter are how ladies’ fashion stores have put comfy chairs beside fitting rooms for a shopping friend or spouse. Or just think about Ikea Småland, where kids play while you shop.

An analogous competitor is “someone or something in a different domain that provides inspiration for or impacts people’s expectations of your product”. Mrs. Freeman gave the following examples of analogous competitors for any product: online car insurance quotes that offer more transparent price setting, and Virgin America, which now shows a funny video on A320 safety at the beginning of flights to make it less boring and promote attention.

In addition to this UX competitive intelligence framework, which I find fruitful, Beverly Freeman quickly presented some other methods to include UX in competitive analysis. I summarize those below.

Usability add-ons: Add tasks for competitors’ products to your existing studies.

Mental model diagramming: Add a competitor layer to a visualization of users’ thoughts, feelings, and behaviors.

The 4 elements of a UX model: Compare products on value, usability, adoptability, and desirability.

“The Golden Circle”: Treat the “Why” (the producer’s beliefs) as an important reason for the customer’s adoption of a product or a brand.

Finally, Mrs. Freeman talked about the traps one can fall into when doing UX competitive intelligence. She warned against “starting with the wrong goal” by only trying to identify other products’ mistakes, coupled with “being too competitor-focused”, which results in “fixing” your product by adding the competition’s features and benefits instead of using competitive intelligence to create real product experience differentiators.

In summary, I found this presentation enlightening, pointing out how we often analyze the UX of our products in a vacuum. With her UX competitive intelligence framework and add-on analysis methods, Beverly Freeman clearly demonstrated how a UX lens can expand one’s comparison with competitors by looking at the holistic experience of a product. I believe this approach can better enable UX practitioners to surpass the competition by really innovating rather than only adding to the feature sets and ease-of-operation of our products.