The User Experience of Enterprise Software Matters

“As important as the user experience of enterprise software is to a business’s success, why isn’t its assessment usually a factor in technology selection?”

Over the past twenty years, the field of user experience has been fortunate. Software and hardware product organizations increasingly have adopted user-centered design methods such as contextual user research, usability testing, and iterative interaction design. In large part, this has occurred because the market has demanded it. More than ever, good interaction design and high usability are part of the price of entry to markets.

However, there’s one area that I believe has lagged behind: the enterprise software space. I can’t tell you how many frustratingly unusable enterprise Web applications I’ve encountered during my 12-plus years in corporate America. As important as the user experience of enterprise software is to a business’s success, why isn’t its assessment usually a factor in technology selection?

Just as the mass market has demanded and is receiving more usable products, so should businesses demand that their technology vendors make their software easier to learn, more efficient to use, and easier to remember. But for a variety of reasons, many organizations don’t even know how to make this demand.

Consider this column a call to action to organizations that buy enterprise-level software. Here’s what I have to say to them:

Your technology selection processes are incomplete. You’re not assessing the usability of the technology you buy. Because of this failure to assess usability, you’re not only incurring huge hidden costs, you’re also letting enterprise technology vendors get away with building products with poor usability.

The rest of this column explains why this happens and what enterprise technology purchasers can do about it.

Enterprise Software

“Enterprise software products are complex, powerful tools. Their complexity is one of the reasons businesses sometimes fail to fully realize the expected return on investment from these products.”

Enterprise software products are complex, powerful tools. Their complexity is one of the reasons businesses sometimes fail to fully realize the expected return on investment from these products.

For enterprise employees, who must use these enterprise applications, this complexity poses a considerable challenge. When an organization deploys an application, it expects users to learn the new system, integrate it into their existing work processes, and become proficient enough to allow the organization to realize the system’s full benefits. Far too often, however, enterprise employees find these new systems hard to learn, hard to master, and difficult to integrate into existing processes.

Enterprise software, which broadly encompasses functions such as enterprise resource planning and management, customer relationship management, supply chain management, network management, project portfolio management, and business intelligence, is a multi-billion-dollar-per-year industry. Well-known vendors include BMC, Oracle, SAP, Siebel, and telecommunications equipment manufacturers such as Nortel and Cisco, to name a few.

Most Fortune 500 companies have multiple enterprise software products installed, and many mid-sized businesses are either actively considering or have already implemented enterprise solutions. As the market has matured and vendors have searched for new growth opportunities, enterprise software developers have made solutions available to even small businesses with fewer than 100 people.

To a growing company, enterprise software promises to convey benefits in a variety of areas, for example:

Centralizing customer information from sales, marketing, customer service, and support to improve customer service and enable better prospect identification.

Configuring and maintaining the performance of network resources, managing and recovering from network faults, managing network traffic for billing and accounting purposes, and ensuring the security of the network.

This is complex stuff.

Consequences of Poor Usability

From the CTO’s and IT Director’s perspectives, these promises assume the internal user groups can and will learn the new systems and incorporate them into their work processes. But these outcomes are far from assured. Some of the problems and pitfalls businesses encounter include the following:

Some businesses find that their employees’ productivity actually decreases, because common or critical processes take longer using the new application.

Others fail to realize an application’s benefits, because users vote with their fingers and don’t adopt the new system.

“If a business mandates process changes and deploys systems users perceive as difficult to learn, use, and remember, the user population will see it as a change for the worse and resist.”

Businesses can also experience reduced employee morale and increased turnover as a result of the imposition of new systems and processes. There will always be some employees who resist change in any form. However, if a business mandates process changes and deploys systems users perceive as difficult to learn, use, and remember, the user population will see it as a change for the worse and resist. In such a situation, employee morale declines, and those who are sufficiently disgruntled may even leave if other opportunities present themselves.

IT (Information Technology) organizations responsible for supporting an enterprise application can find themselves overwhelmed as they struggle under the unexpectedly high numbers of support requests that often accompany an application rollout. As anyone who has worked a help desk knows, rollout day for a complex application often seems like a perfect storm for level-one support staff.

Why do such scenarios play out in organization after organization? I argue that two factors drive these kinds of outcomes:

Vendors build enterprise applications without adequately incorporating users’ actual wants and needs.

Enterprises don’t hold their vendors to high enough standards for application learnability, usability, and efficiency.

To illustrate the real-world costs of these two failures, I’ll relate two case studies. These are true stories. They happened in organizations I’ve worked for—though I’ve obscured their identities to protect the guilty.

Case 1: The Business Intelligence System

The Technical Support group for a major financial software application was responsible for generating weekly and monthly reports on calls to their help desk. The reports summarized statistics for management, including call burden, call reasons, call resolution time, hold time, post-call work time, and other statistics that let them track costs and the productivity of support representatives. The process of producing the reports was mostly manual. The data resided in three separate systems, requiring data extraction through complex—though usually repetitive—queries, and they generated and formatted reports using a spreadsheet application.

Other groups within the organization—including Product Management, Development, User-Centered Design, and Quality Assurance—frequently requested custom reports and raw data from this amalgam of systems. Trying to deliver their periodic reports and comply with outside requests sometimes overwhelmed the people in the Technical Support and IT groups.

The business decided to deploy an enterprise-level application that would enable Technical Support to centralize the data, automate data retrieval and report production, and provide outside groups with self-service capabilities that would let them meet their own data and reporting needs.

Upon deploying the application, the business discovered that it took five to ten times as long for the Support and IT staffs to extract the data and produce the reports—even after training had made them proficient in the application’s use. Both groups discovered that the Web application’s user interface comprised a reporting wizard that required users to drag and drop date ranges, field names, and other delimiters into a form. Complex queries that had previously taken five to ten seconds to type now took two or three minutes to drag, drop, delimit, and run. The application forced users—who had considerable technical abilities and expert-level knowledge—to interact with the system as if they were neophytes.
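To make the contrast concrete, here’s a minimal sketch, in Python with an in-memory SQLite database, of the kind of repetitive, parameterized query an expert analyst could type in seconds. All table and column names here are hypothetical; the real systems and schemas differed. The point is that only the date delimiters changed from week to week, which made the query trivial to retype but tedious to rebuild in a drag-and-drop wizard.

```python
import sqlite3

# Hypothetical call-log schema -- invented for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE calls (
    call_date TEXT, reason TEXT, resolution_min REAL, hold_min REAL)""")
conn.executemany(
    "INSERT INTO calls VALUES (?, ?, ?, ?)",
    [("2006-03-01", "install", 12.0, 1.5),
     ("2006-03-02", "billing", 30.0, 4.0),
     ("2006-03-03", "install", 8.0, 0.5)])

def weekly_summary(conn, start, end):
    # The weekly report is the same query every time; only the
    # date range changes between runs.
    return conn.execute(
        """SELECT reason, COUNT(*) AS calls,
                  AVG(resolution_min) AS avg_resolution,
                  AVG(hold_min) AS avg_hold
           FROM calls
           WHERE call_date BETWEEN ? AND ?
           GROUP BY reason
           ORDER BY calls DESC""",
        (start, end)).fetchall()

for row in weekly_summary(conn, "2006-03-01", "2006-03-07"):
    print(row)
```

An interface that forced proficient users to rebuild this structure by dragging field names into a form, one delimiter at a time, turned a five-second task into a multi-minute one.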

Furthermore, the application output the reports as static HTML that users could not reformat, rotate, or otherwise adjust. Although the simplistic process of query-building proved easier for occasional users from outside groups, their inability to manipulate the output proved frustrating. As a result, the Support and IT staffs were again forced into a bottleneck role, even more laboriously creating reports to comply with outside requests. Within six months, the Technical Support group had brought their old systems back online and reverted to their previous process.

Case 2: The Expense Reporting System

A large telecommunications equipment manufacturer decided to move from spreadsheet-based expense reporting to a system that let users input expense information directly into the company’s accounting system. The application promised to eliminate manual steps—including double entry of data—remove data-entry bottlenecks, and streamline the accounting process.

Employees at this company had an inkling that the new system might pose difficulties when, two weeks prior to the system rollout date, HR (Human Resources) disseminated a 50-slide training presentation to all employees. Next, all employees found out they must complete a mandatory hour-and-a-half-long desktop-video training session in the use of the new system that HR and IT had developed jointly.

The productivity lost to training employees to use the new system was a significant expense, as were the projects that were necessary to produce the training materials. However, the loss in productivity and expense the organization suffered when the system actually went live dwarfed the up-front training costs.

Everything about the application’s user experience was problematic. The process employees followed to enter, describe, and categorize expenses was confusing, unnecessarily long, and ill thought out. The data-entry screens were poorly designed. The terminology the application used, while familiar to finance and accounting professionals, was opaque and unclear to most other employees. The application presented information in illogical formats. For example, it forced users to scroll through 200-item drop-down lists of accounting categories and cost centers, whose order made sense only to those in the Finance department. Users could not re-sort lists in the user interface into alphabetical or numerical order.
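The fix users wanted was trivial from an engineering standpoint. As a hypothetical sketch in Python, with invented ledger codes and labels, presenting the same categories in alphabetical order for users costs one line, while the underlying ledger order the accounting system depends on is left untouched:

```python
# Hypothetical (ledger code, label) pairs, in the Finance
# department's internal order -- opaque to everyone else.
categories = [
    ("7420", "Travel - Air"),
    ("6100", "Office Supplies"),
    ("7410", "Travel - Lodging"),
    ("6300", "Client Entertainment"),
]

# One sorted() call gives users an alphabetical view without
# changing the order stored in the accounting system.
display_order = sorted(categories, key=lambda c: c[1].lower())

for code, label in display_order:
    print(f"{label} ({code})")
```

Sorting for display while preserving the canonical back-end order is exactly the kind of accommodation the application failed to make.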

Successfully submitting an expense report, which had previously taken only a few minutes, was now a half-hour undertaking fraught with error and frustration. As a result, productivity and morale suffered. Worse, compliance waned and systemic errors propagated through the accounting system. Some employees simply stopped expensing small purchases or assigned expenses to accounts that appeared near the top of long account lists.

I’m sure my experiences are not unique, and these types of fiascos occur in enterprises worldwide. Why? Let’s explore the dynamics of technology selection.

The Vendor’s Lament: If You Build It, They Will Complain

“Many enterprise software development processes don’t adequately incorporate users’ specific wants and needs—at best, waiting to consider them until it is too late in the development process; at worst, failing to do so at all.”

With their page-spanning feature matrixes, long lists of supported platforms and databases, ROI calculators, and downloadable case studies, enterprise application providers make fantastic promises. However, many enterprise software development processes don’t adequately incorporate users’ specific wants and needs—at best, waiting to consider them until it is too late in the development process; at worst, failing to do so at all.

So why aren’t users’ wants and needs considered earlier in the development lifecycle? There are many reasons, but I can boil them down to this short list:

Vendors build applications to satisfy their own perceptions of users’ needs, not users’ actual needs.

Engineering groups own too much responsibility for user interface design.

Featuritis makes applications unnecessarily complex.

The Vendor Is Not the User

“Product Managers, Requirements Analysts, and Engineers make assumptions about users instead of observing users and asking them about their wants and needs.”

Often, vendors build applications without incorporating the perspectives of actual user groups. Product Managers, Requirements Analysts, and Engineers make assumptions about users instead of observing users and asking them about their wants and needs.

Gathering information about users’ wants and needs is not difficult, but it’s important to do it right: user research done poorly or incompletely yields a biased, distorted picture of users’ requirements.

The key to developing an accurate picture of user needs is to distinguish the main user groups—and how they differ from one another—then identify the users’ skills, tasks, and needs according to the roles they assume while using the application. It’s also helpful to test the usability of conceptual prototypes with actual users from the target customer groups. In this way, you can test early concepts and iterate designs very inexpensively.

Engineers Are Not the Users

Many development organizations give engineers responsibility for transforming requirements into user interactions, process flows, and screen designs. What results is a user interface that reflects engineers’ mental models. Unfortunately, their mental models for how things work differ drastically from those of users. Consider this example:

The engineer’s perspective—It’s a state-persistent container for database objects that requires authentication and setting cookies.

The user’s mental model—It’s a shopping cart.

Though both are valid, these mental models are very different indeed. And what makes sense to the engineer would be completely incomprehensible to the user.
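One way to reconcile the two models is to keep the engineer’s abstraction internal and expose only the user’s vocabulary at the surface. A minimal, hypothetical Python sketch, with invented class and method names, of that layering:

```python
class SessionObjectContainer:
    """Engineer's model: a state-persistent container for
    database objects, keyed to an authenticated session."""
    def __init__(self, session_id):
        self.session_id = session_id
        self._items = []

    def persist(self, obj):
        self._items.append(obj)

class ShoppingCart(SessionObjectContainer):
    """User's model: the same mechanism, surfaced in the
    user's own vocabulary."""
    def add_item(self, product):
        self.persist(product)

    def item_count(self):
        return len(self._items)

cart = ShoppingCart(session_id="abc123")
cart.add_item("widget")
print(cart.item_count())  # 1
```

Nothing about the implementation changes; only the names and concepts users encounter do.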

Featuritis: The Bane of Users

“Featuritis is a pernicious malady.”

Featuritis is a pernicious malady. Both vendors and purchasers contribute to the persistence of this disorder. Here’s an example of what typically happens on the vendor side of the equation:

Competitor A has these five features and competitor B has these ten other features, so we’d better put them all in our next release.

This kitchen sink approach to product definition leads to a mishmash of features, with no organizing principle or overarching information architecture.

Technology Selection in Enterprises

“Organizations could avoid many a rollout disaster simply by testing the usability of vendors’ solutions with employees during a trial phase.”

Purchasers buy applications with poor usability, because they don’t know any better. The evaluation and decision-making process for the purchase of enterprise applications usually looks like this:

An organization identifies the need for a better, more scalable, or faster process.

Management makes the business case for deploying a new application.

The IS (Information Systems) organization sets technical and feature requirements that are often informed—in a somewhat circular fashion—by vendors’ application feature lists.

Purchasing solicits vendors—sometimes, asking them to respond to a Request for Proposal, or RFP.

Purchasing evaluates vendors on the basis of their responses and generates a short list of possible vendors.

IT often brings vendors’ systems into the organization’s test labs for performance and technical trials.

The organization selects a vendor.

IT undertakes the deployment of the new application.

This decision-making process typically neglects methods of evaluating the goodness of fit between the enterprise users’ processes, wants, and needs and vendors’ solutions. Organizations could avoid many a rollout disaster simply by testing the usability of vendors’ solutions with employees during a trial phase.

Enterprise customers can test the usability of vendors’ applications to ensure the IT investments they make deliver fully on their value propositions.

In my next column, I’ll describe how organizations can better assess and establish the usability of the enterprise applications they’re considering and, armed with this information, push technology vendors to develop more usable enterprise products.

12 Comments

Another reason for the poor design of enterprise applications is that they are proprietary, so designers can’t view other enterprise applications to get good ideas. Enterprise apps also borrow ideas from commercial sites that aren’t entirely appropriate for the corporate world. Enterprise applications will always lag behind commercial Web sites for this reason.

@Wyndham: I am pretty sure that the 2nd edition of “Cost-Justifying Usability” has a few case studies on enterprise software. Not having acquired the second edition yet—my apologies to Randolph Bias—I can’t say for sure. Maybe peek at the TOC via Amazon’s look inside the book feature.

With that said, however, I have never been less sure that pure ROI calculations really convince the executive decision-makers. At the most, ROI numbers can support a good success story narrative. But they don’t really speak to people as much as stories do.

John Rhodes, Dan Szuc, and I have been saying as much in our tutorial about how to sell usability within organizations. And it was also the reason I gathered together the authors for the book I edited, Usability Success Stories. Stories are just more persuasive than ROI calculations.

When you tell a good story, the people you’re trying to influence tend to put themselves in the place of the organization you’re describing and begin to imagine a different future for themselves. I don’t know if that makes sense, exactly, but I’ll put it out there anyway…

/steps off soapbox/

@Julianne: good point about enterprise apps inappropriately appropriating design patterns from Web sites. True that; seen it myself.

Our company makes a help desk application called HelpSpot. We sell primarily to mid-level companies and departments, but also to education and some enterprise. Unlike many enterprise apps, it’s easy to try our software. We’ll even host it.

One of my biggest problems with our UI development, though, is the almost complete lack of quality material on enterprise/B2B design principles for the Web-based world. Almost everything about Web-based UI is for consumer sites. The stuff for B2B is all very old, based on networked or desktop app design, and not very relevant.

I think the biggest area of neglect is good information on data density. Most consumer apps have little hard data they need to convey to the user; they throw up a big form with 5 questions and 7 rows of data and they’re done.

In our case, customers may be looking at hundreds, thousands, or hundreds of thousands of pieces of data. Practically no information is out there on modern UI techniques for displaying this level of information.

It’s certainly possible I’ve missed some of this information. If you have any good sources, I’d love to see them.

I’m the only full-time designer on an enterprise Web application. Julianne is right—one reason apps like mine are subpar in terms of usability is because our competition is hidden. Like recently, I discovered an open source app that does the same thing as ours. But I’d have to sign up and download it to a Web server and install it, and get phone calls from them, and so on. It’s not a site you can just go to and analyze any time.

I think another big reason is that, in my company anyway, the back-end developers—whether it be PHP, Java, or .NET—rule the roost. They are simply given more power in the process. UI is given less, because an argument can always be made about how few people will really use this. Or how little it will get used.

A third big issue is client requests. For a Web app open to the public, you don’t have to listen to anyone unless they are in a majority. And even then, it’s not mandatory. But with an enterprise Web app, you must do what your client wishes. Of course, there is analysis about what they really want and hardly ever does something get built exactly how the client requested. But features are definitely built per a client’s requests, and that might not be what the next client wants, but how are we to know that? And if the client who asked for it needs it done quick, what are we to do?

My point is that it’s never as easy as these types of articles make it sound. Not that these articles aren’t valuable, but rarely does a real situation fit well into any advice on this subject.

@Ian: I agree with you. It feels like there’s a dearth of information about how to design enterprise Web apps. And I do see public Internet patterns inappropriately applied in the enterprise.

@Stacia: Good points. And I’m not unsympathetic to the vendors’ side; I’ve been there myself. Oftentimes the biggest client makes the biggest noise, so everything gets built according to their wishes. And what then? Clients B, C, D, and so on don’t always have exactly the same workflow, so the interaction design ends up being nonoptimal for them.

You’ve hinted at this, but to clarify: The single biggest problem with enterprise software purchasing is that the people who make the purchasing decisions are typically not the people who have to use—or even support—the purchased software day in and day out.

I worked for several years in an enterprise IT organization and saw this constantly: New systems would be rolled out based on some mandate from Finance, or HR, or Legal, or Corporate Communications, and it would become clear within a few days that any user or first-level IT support technician—had they been allowed to evaluate the solution—would have rejected it out of hand as a usability and support disaster.

A particularly great example of this was our company intranet. The existing intranet ran on Microsoft Content Management Server. It generally worked fine for posting basic employee content, but had some limitations—like poor search functionality, poor accessibility from worldwide offices, and a mediocre content editor.

Following a corporate reorganization, HR and Corporate Communications decided to replace the intranet with something better. They started off with good intentions, holding a series of employee focus groups to see what people wanted out of a new and improved intranet. However, the focus groups were strictly high-level, general brainstorming: we want better search, we want to be able to post forms, we want customized content per location or department, and so on.

HR and CorpComm then went away and, six months to a year later, came back not with a proposal to evaluate or a few options, but a final decision: Here’s your new intranet. Blam. It appeared that, instead of doing any serious evaluation of focus group requests against the options, they had picked a vendor (Oracle) that we already had a large contract with. I suspect an Oracle sales or account rep assured someone—with the typical sales wave of the hand—that “Oh yeah sure, it can do all that stuff.”

In reality, the search engine sucked even worse, requiring manually entered keywords on all content to return anything meaningful. The editing experience, while a bit more flexible, was much more complex and resulted in terrible-looking output unless you manually tweaked the HTML source code. It was impossible to design and deploy a form without back-end administrator intervention and custom coding. Instead of automatic login using your domain credentials like the old intranet, the new site required a separate Oracle login that our IT department didn’t have reset authority for, because it touched finance systems. It was slow and suffered frequent outages. It was a disaster from an IT support perspective, going from two servers for the old intranet to over 15 for the new one.

Corporate Communications helpfully migrated all the content from the old intranet to the new one, except that they decided to reorganize it in the process and put a lot of it in the wrong place. The migration process also broke most of the old layout and formatting. They tried to train content editors in each department, but I don’t think most of them ever really had the time or stamina to figure the thing out and sort out the mangled content that had been dumped in their section.

When most users were reluctant to adopt this new beast, instead of asking why and reevaluating, they decided to force it down everyone’s throats. Corporate Communications made the IT department—against our will—push out a system setting to all company computers that would force the browser home page to the new intranet. (Guess who got to field all the complaint calls?)

When I left that company a year and a half after the rollout, the old intranet was still operational and a different team, led by IT, was trying to investigate alternatives for moving forward. Big, expensive failure. Now, Corporate Communications at this company was an exceptional model of unaccountability and incompetence. (They once famously distributed another company’s email address, instead of their own, for employee questions, and then demanded that IT find a way to reroute all internet email sent to that address rather than send a correction admitting their mistake.) But I have a feeling this type of thing is common. If average employees and front-line IT staff had been given a say in the evaluation process, instead of just bean-counters and corporate policy wonks, the outcome might have been different.

@Stacia: I’m in a similar situation. I’m redesigning a suite of enterprise apps and systems for a small company. As I read this article, I mentally replaced vendor with development and found it resonated. The main difference here is, rather than building a system for my company’s customers, we’re building one for our own employees. I find that, when designing for in-house users versus customers, there is a disparity both in the level of satisfaction owed to users and in the burden of responsibility to compensate for a user’s poor experience. Upper management is most concerned with the functionality they want and getting it into production quickly and cheaply instead of building a good product.

I totally agree that many UX articles and books touch on processes that don’t seem to scale down to situations like ours, where there are not dedicated UX, IxD, IA, or even research teams. As I’m sure you and many others do, I try to take what I can and apply it solo.

Thanks, Paul, for pointing us toward an area that doesn’t often receive the attention it deserves.

I presented a user-centered methodology for how organizations can better assess and establish the usability of their enterprise applications during the procurement stage at UPA 2009. I’d be happy to share all the materials if people are interested. (Unfortunately, the UPA site seems to have dropped some.) Two others and I have also written an article on this topic that I believe may be published in UX Magazine in early 2010.

For a company to have bought enterprise software that is not user friendly or can’t be used by employees to its full potential is a wasted resource, not to mention a waste of money. Unfortunately, it’s still quite common to find systems being developed and sold where very little consideration, if any, has been given to the user. Hence the buyer of this kind of software really needs to pick a vendor that offers training on their products and proper after-sales service.

“The single biggest problem with enterprise software purchasing is that the people who make the purchasing decisions are typically not the people who have to use [it].”

Having worked on the vendor side for many years, I absolutely support Ryan. It is really the core reason for the UX issues in the enterprise area. If the UX is not a main criterion for a buying decision, why should I, as a vendor, invest in it? Instead, I put my efforts into full RfX compliance, ROI calculations, and great PowerPoint slides. As a vendor, I do what the customer wants. And the customer is not an end user.

Even worse, many executives on the vendor side do not know how to use the application and, consequently, do not care about UX. Believe it or not, some have not even seen the UI of the applications they are responsible for. The effect is that there is no corporate culture of usability and product excellence. In my experience, large vendors do not work the way they should: when there is a new release, management should be looking forward to using the system, trying out new features, and giving feedback. Instead, they check milestones, cost compliance, and how many customers have pre-ordered it.

How can we change this?

Two action points for enterprise software purchasers such as IT or sourcing:

Next time you see a vendor, ask the vendor’s senior managers to demonstrate the product to you. I do not mean PowerPoints or nice talks. I mean: ask them to execute some main use cases in front of you. And the manager personally, not the vendor’s techie / pre-sales specialist.

Include a small usability evaluation in the RfX / purchasing. In my experience, even some simple evaluation—for example, having five people execute the ten main use cases—gives a good basis for making the purchase a success, maybe a better basis than a 20-page RfX Excel sheet.
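That evaluation needn’t be elaborate. Here is a minimal sketch in Python, with entirely hypothetical numbers and vendor names, of how a purchaser might tabulate such a trial: five participants each attempt the ten main use cases against each shortlisted product, and the completion rates are compared.

```python
# Hypothetical trial results: for each candidate vendor, one list
# per participant marking which of the 10 main use cases they
# completed (1) or failed (0).
results = {
    "Vendor A": [[1,1,1,0,1,1,1,0,1,1],
                 [1,1,0,0,1,1,1,1,1,1],
                 [1,1,1,1,1,0,1,0,1,1],
                 [1,0,1,0,1,1,1,1,1,1],
                 [1,1,1,0,1,1,0,1,1,1]],
    "Vendor B": [[1,0,0,0,1,1,0,0,1,0],
                 [1,1,0,0,0,1,0,0,1,1],
                 [1,0,1,0,1,0,0,0,1,0],
                 [0,0,0,0,1,1,1,0,1,1],
                 [1,1,0,0,1,0,0,0,1,0]],
}

def completion_rate(sessions):
    # Fraction of all attempted tasks that were completed.
    attempts = sum(len(s) for s in sessions)
    completed = sum(sum(s) for s in sessions)
    return completed / attempts

for vendor, sessions in results.items():
    print(f"{vendor}: {completion_rate(sessions):.0%} of tasks completed")
```

Even a crude tally like this surfaces a difference between candidates that no feature matrix or RfX spreadsheet would reveal, and it takes a day or two of trial time, not months.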