Customer service -- it matters

(This article first appeared on IBM developerWorks.)

Peter Seebach (seebs@plethora.net)

Freelance contributor

March 10, 2003

Usability testing on phone systems offers valuable lessons for Web developers. After all, happy customers help a business grow.

Everyone hates automated phone systems. They've gotten better, really, but to this day, they are often pinnacles of unusability. Many of the things that make phone systems frequently frustrating to use apply just as well to Web pages.

Service is a cost center

One of the reasons that phone systems are often difficult to use is that, in most companies, customer service is viewed primarily as a cost of doing business. Since it has no revenue stream, the focus is generally on minimizing expense. Managers, then, are compensated for successfully cutting costs and not necessarily for making customers happy.

To make it worse, it's very hard, and fairly expensive, to really measure customer satisfaction. Many phone systems use proxies to measure success; for instance, short hold times are generally seen as good, and short call times are seen as a good cost-cutting measure. This can create perverse incentives; when given an order that call times must be shortened, phone reps have been known to hang up on difficult calls, rather than resolve them. If the customer calls back, that's fine -- it's now two calls, and higher call volume means a need to expand the department, which is also just fine. End result? Customers are a lot less happy.

I talked to Peter Leppik, the founder of VocaLabs. VocaLabs is a company that does usability testing on call centers, especially automated phone systems. Their research shows that, after a positive experience with a call center, 88% more customers said they were likely to buy something from the company within the next year; of course, if the experience was negative, the numbers didn't look so good. Customer service is how you build relationships with customers; so if the service is bad, they likely won't be back.

There's only one thing worse than an economic model which harms the customer: an economic model which harms the customer and the company. In fact, a lot of the efforts made to reduce call center costs end up driving customers away. When I called Dell to inquire about a laptop, I got as far as explaining that my needs were a bit unusual, and I wanted to make sure I got a laptop that would fit them well. The sales rep I was talking to said "Well, I'm on commission, so try to make it fast." Needless to say, I didn't buy a Dell.

Reducing costs, reducing satisfaction

For a long time, it was generally understood that the best you could hope to do with an automated system was to reduce the volume of human-assisted calls by perhaps 35%, maybe as high as 50%. Because the automated system was seen only as a way to keep humans out of the loop, it was allowed to be awful.

As a result, many companies discovered that if they simply didn't allow you to connect to a human, they could keep costs even lower! Of course, perhaps they lose a few customers, but costs are down, and that's a start.

Newer results suggest that most users are willing to use an automated system. Furthermore, if that system actually works well, as many as 90% would rather use an automated system than talk to a person simply because, when they work at all, computers are typically more reliable and are never surly or impatient. So, if your phone system is getting a really high volume of human-assisted calls, the solution may be to fix the call system and not to try to prevent a frustrated user from talking to a real person.

Companies are often resistant to addressing these issues, in an effort to keep costs down. In many cases, those costs show up anyway, as higher call volume or departing customers.

Experienced users are the exception, not the rule

Most of us only call any given automated system a few times, and indeed, the majority of calls to most systems are first-time callers. Unfortunately, many testing methodologies revolve around having the same small group of people perform dozens of tests. By the time they've done a few tests, these users are more experienced with the system, more comfortable with it, and more familiar with its limitations than any typical user will be; this eliminates a great deal of the benefit of testing a system.

VocaLabs found an interesting way to address this. They have a large pool of people who can be asked to test a phone system. They call into a switchbox at VocaLabs, and their calls are connected to the target phone system where the call is recorded. After the call, they fill out a survey on the VocaLabs Web page. Hundreds of people test each system, and each call's recording is associated with the survey data. If a few people report a problem with a given part of the system, those calls can be listened to and evaluated.

As a result, they have very broad sample spaces, and can get detailed information about any given call when it's needed. This also helps catch infrequent problems. If your sample space is a dozen people, you could easily miss a problem that affects about 5% of callers, but losing 5% of a high-volume call center's callers could cost thousands of dollars a month.
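The arithmetic behind that claim is easy to check. A quick sketch (the 5% figure and panel sizes are taken from the paragraph above; the function name is mine) shows how often a small, independent test panel would never encounter a problem of a given frequency:

```python
def miss_probability(testers: int, problem_rate: float) -> float:
    """Chance that none of `testers` independent calls hits a problem
    affecting `problem_rate` of all callers."""
    return (1 - problem_rate) ** testers

# A dozen testers miss a problem affecting 5% of callers
# more than half the time:
print(round(miss_probability(12, 0.05), 2))    # about 0.54

# A panel of several hundred testers almost never misses it:
print(miss_probability(300, 0.05))             # well under one in a million
```

In other words, a dozen-person panel is more likely than not to give a 5% problem a clean bill of health, while a sample in the hundreds makes that essentially impossible.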

The best time to test a system, of course, is before it's rolled out; testing should be a part of the design process whenever possible, and in particular, early prototypes need testing. Sometimes, this is done by what's called the Wizard of Oz method -- you have a person with a script who pretends to be the planned automated system. (The tester is supposed to ignore the man behind the curtain.) This can catch obvious logical flaws, such as a system that doesn't ask for an account number before trying to look up your account balance in a database; but it's much too expensive to do high-volume testing with, and, of course, provides very little testing of any speech recognition that might be used.

Applying this to Web sites

A lot of the lessons learned here apply well to Web sites. Many companies persist in thinking of a support Web site as a way of reducing the cost of providing support; unfortunately, these companies don't appreciate the potential to make customers happier.

Testing is sporadic and often rooted in bad assumptions -- most sites are tested only by people on the local intranet, so loading speed over real-world connections is never even considered.

Just as with phone systems, people put a lot of effort into making it hard to communicate with real people and don't put much effort into making the Web site more useful. Too many sites give only a complicated (and often unusable) form to fill out, rather than providing, say, an email address and a phone number. Go ahead and provide the form. If it's a good form, some people will use it. But always remember that you need that safety valve; those who use it are precisely the ones who, when denied it, will be unhappy, and will rightfully blame you for making things worse.

Success, it turns out, comes more from quality than from flashy looks. Google's plain white interface that actually produces good search results turns out to be a much more attractive model than search engines that plaster the page with exciting graphics, menu after menu, and bury the search results inside frames and frames and more frames.

Most of all, remember that interaction with your customers is how you build a relationship with them. Make it a beneficial relationship.

This week's action item: Play around on the Web sites of a few companies you buy products from. How usable are they? How quickly do you find yourself needing to talk to someone because you can't get what you want from their Web site?

About the author

Peter Seebach has been having trouble navigating through badly designed pages since before frames and JavaScript existed. He continues to believe that, some day, pages will be designed to be usable, rather than designed to look impressive. He can be reached at crankyuser@seebs.plethora.net.