Sidebar: The Five Basic Interface Dos
You shipped! Congratulations!
You did everything by-the-book: spent up-front time planning the project, got client management sign-off, wrote a functional spec, coded like mad and made the rollout date. The app is solid, on time and on budget. The client oozes confidence.
It is only then that things start to go wrong.
The calls start… end-users want to know how to save a report. How to email it. How to request a second copy. Basic stuff. Are these people idiots? It’s right there on the first menu… and in Help. Can’t they even press F1?
Your voice mail is full. What about your schedule? You’ve got a week to do the spec for version two. You’d vow never to offer service/support on a client contract again, but clients would never go for it.
How can you avoid “schedule-suck” and lost money? (National averages say calls to your help desk cost you $32.00 apiece.)
Well… what if you brought users in from the start? Let them have a say in how your application gets designed?
“What?” you say, “Users can’t design applications!” Have you talked to these… these people? Didn’t you hear that horror story—that project where we involved end users in design? Indecision, fights, slipped schedules, cost overruns…
That’s right, of course. Giving users control over your design process would be counterproductive. End users generally don’t know how a finished application should be coded. That’s why they hired you.
Still, there are many things users know that you don’t. Wouldn’t it be great if you could actually work more closely with the users? You’d know:
- The ins and outs of their project workflow
- What “little things” you could add to make their jobs easier
- What to do (and not to do) before coding, avoiding costly midstream changes
- What 20% of functionality will please 80% of your users and save you a lot of time coding stuff users don’t really care about
- Where the problem areas are so you can solve them—while they’re still solvable
You must involve users in your design process, gather the information you need, and still retain enough control to get your project out on time and on budget. How? Here’s a structured method that lets you guide and counsel your users through the design process.
I. Build a design team
To start, don’t leave the design process to the programmers alone… build an application design team using people from your company. Ideally, you want to include people from several different disciplines, with skills that complement one another, so you can cover most of the bases. You’ll need:
- A trainer/facilitator—a “people person” who can handle communicating with end users and orienting them in clear, non-technical language
- A technical writer to record things and assemble the spec
- A programmer who can say what can and can’t be coded—and to play the “computer…” you’ll see in a bit…
- An expert in application content and business use. Can be a full-fledged professional software designer, or a “power user” from inside your company.
The team has to be just big enough, but not so big it becomes an Olympian feat just to schedule a meeting. Four people is a minimum (facilitator, writer, programmer, content expert), eight an absolute maximum.
Meet with your team prior to project kick-off. Have lunch together and get a feel for how they work together. Over appetizers and drinks, give them an idea of the process you plan to use (this one!). Explain the benefits: reduced development cost, faster time to deployment, increased user acceptance and lower support costs. Seal the deal with a nice dessert (I hear the creme brulee is excellent).
Now it’s time to get down to work. As soon as possible, meet with your team and develop a list of product function questions to ask your end-users. Typical questions: “What tasks do you perform in your job?” “Do you use paper forms… if so, are there samples?” Write the questions down.
II. Conduct onsite interviews with users
Now it’s time to visit your users where they work. You’ll need to schedule a meeting with all of them in a conference room, bring that list of product function questions, and write down every answer that they give you.
But you’ll do more than that. You’ll look around at each user’s workspace and note:
- Where is their computer located: On their desk? On a common table where six people share it? In a conference room? In their boss’ office?
- Do they have to switch from one application to another a lot? How many interruptions a day by peers, clients, ringing phones (remember?)?
- Do they know how to use a mouse, or are they typewriter jockeys who love to memorize keystroke chords that would make Brahms jealous?
- Do they live in their manuals, hoarding every doc page like a treasure map, or do the books gather dust on the shelves?
- How many hours a day do they use the computer?
- Is the computer a “microwave oven” or a “gourmet range” to them?
Write this all down too and develop a profile of each user.
After you start seeing repetition, summarize similar users and give these summaries names, like “Jim” and “Betty.” The names will help later on, when the team makes a case for a design or feature (Example: “Jim would love this Excel export function!” “Yeah, but Betty wouldn’t care.”). Write these down on one sheet of paper and write “User Profiles” on top of the sheet.
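If it helps your team keep the profiles comparable, you can capture each persona summary in a simple structured form. A minimal sketch in Python; the field names and the two sample personas are illustrative, not prescribed by the method:

```python
from dataclasses import dataclass

@dataclass
class UserProfile:
    """One composite persona summarizing several similar interviewees."""
    name: str                    # persona name, e.g. "Jim"
    computer_location: str       # own desk, shared table, boss's office...
    hours_per_day: float         # daily computer use
    prefers_mouse: bool          # mouse user vs. keystroke "typewriter jockey"
    reads_manuals: bool          # lives in the docs, or lets them gather dust?
    interruptions_per_day: int   # peers, clients, ringing phones

# Hypothetical personas distilled from the interview notes
jim = UserProfile("Jim", "own desk", 7.0, True, False, 12)
betty = UserProfile("Betty", "shared table", 1.5, False, True, 3)
```

Recording every persona with the same fields makes later design arguments (“Jim would love this, Betty wouldn’t care”) easy to check against the data you actually gathered.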
From the data you’ve gathered, develop and record a list of five to nine common tasks your users want to perform with your product.
Task list rule of thumb: Yes, “nine” sounds like a tough upward limit, but after a certain point, the total number of tasks can cause “feature crash,” where you’ll find it hard to make your design specific to any task. Cognitive psychologists say there are hard limits to human perception and short-term memory: the maximum number of discrete units the average person can retain is “seven, plus or minus two.” That’s a maximum of nine and a minimum of five units. So if the number of tasks exceeds nine, you might want to reframe your design into smaller chunks. Example: If your application will have data entry and reporting functions and that puts you over nine tasks, split the design effort into a “data entry” chunk and a “printing” chunk. Each chunk can have up to nine associated tasks. Later on, you can unify the chunks.
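The chunking rule is easy to mechanize. A toy sketch, assuming each task has already been tagged with a functional area (the tags and task wording below are made up):

```python
from collections import defaultdict

# Hypothetical task list: (functional_area, task) pairs from the interviews
tasks = [
    ("data entry", "enter a customer order"),
    ("data entry", "correct a mistyped order"),
    ("reporting", "print a weekly sales report"),
    ("reporting", "email a report to a manager"),
]

def split_into_chunks(task_list, ceiling=9):
    """Group tasks by functional area and flag any chunk that still breaks
    the 'seven plus or minus two' ceiling."""
    chunks = defaultdict(list)
    for area, task in task_list:
        chunks[area].append(task)
    oversized = [area for area, items in chunks.items() if len(items) > ceiling]
    return dict(chunks), oversized

chunks, oversized = split_into_chunks(tasks)
```

If `oversized` comes back non-empty, that chunk needs a finer-grained split before you move on.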
Word the tasks in non-technical language. Now commit them to a single sheet of paper and write “Task List” on top.
You have now completed two deliverables for this step: user profiles and a task list. Put them in your design project file and let’s move on.
III. Crank out low-tech prototypes fast (with paper!)
The common software prototyping buzz says, “don’t write your prototypes in C++. Use an easier, higher-level language, like Visual Basic or Delphi to crank them out quickly.” But there are costs associated even with these “cheaper” prototyping methods:
- VB and Delphi coding still require work, so designers quickly get attached to their design investments. After doing the grunt work, some designers start to resent constant changes to their code. They could end up defending their work, rejecting even minor suggestions.
- If you put a programmer into a development environment, no matter how simplified, they will find it hard to resist the temptation to “app-smith,” perfecting every dialog box and painstakingly lining up buttons. They might even add unasked-for and unneeded features and try to debug the prototype so it never crashes.
- Sometimes a middle manager likes the prototype so much, they say, “Great! Ship it tomorrow.” And so you end up spending the next four years supporting a product that was whacked out in two days for a quick prototype.
This will be a prototype, not an application. The basic idea is to:
- Be able to prototype and make changes quickly, so you can respond to user feedback from usability testing (see the next section for more on this)
- Develop prototypes you can feel OK about throwing away
- Make a prototype nobody (especially those pesky middle managers) will confuse with an actual product
That’s why you should use paper “low-tech” prototypes and stay away from the computer.
The materials are simple: Paper, colored pens, acetate sheets (which can provide a wipe-off surface for mocked-up dialogs when users have to “type data in” during testing), glue pens, adhesive tape. Sure, you can add sophistication if you want… some users of this method draw dialogs, windows, buttons and list boxes in a high-end drawing program, making them look as realistic as possible. But for your purposes, “crudely drawn” can also be effective; the raw, immediate look can communicate to your testers exactly what’s being accomplished: a “sketch” to use for evaluating your system design.
IV. Design with users in mind
You’ve got your user profiles and task list from step II, and your low-tech prototyping kit from step III. Now you’re at the “blank sheet of paper” step. How to fill that in with a usable design?
- Brainstorm with your team to find a central, consistent “metaphor” for how your system will look and feel. (To “brainstorm” means that everyone can contribute ideas and nobody can refine, argue with, or shoot down any of them.)
Your system’s metaphor can come from real-world objects (remember the “desktop” metaphor used originally by the Xerox Star and Macintosh?). You can use a tool, document, or other object as the basis for your metaphor. A metaphor can come from:
- The real world (a “telescope” can let you see data from far away, a “keypad” as found on a telephone can provide a way to let users punch in data quickly, an “assembly line” can process information using a series of discrete “machines,” etc.).
- The world of computers (e.g., a “spreadsheet” metaphor, a “control panel,” etc.)
It’s best to draw metaphors from objects your users actually work with and tasks they perform—with or without a computer—in their workday.
Write the metaphors down on a whiteboard.
- Try to sketch out how each metaphor would basically work with the data. Don’t take it too far—maybe five minutes per metaphor. Examine how the metaphor would actually work. It’s okay to hit dead ends and change direction. That’s what this design phase is for.
- Narrow the field to two or so metaphors that seem particularly strong. Split your group up into two teams, and spend an hour sketching out how each system would look and feel. Now pick one. But keep the other on the back burner—you might decide you prefer it later.
- Whip out your design kits and start to cut, draw and paste your systems. Think about the points where users will have to interact with the system, how the system will indicate what the user should do, and how it will respond at each point. Start to script these out. Pick someone from your group to play a “computer” and another to play one or more of the users from your profiles. Rehearse their interactions.
V. Usability test on $5.00 a day
Now it’s time to bring users back into the loop.
When they hear “usability test,” many managers go pale. They’ve heard about how much Microsoft spent on their Windows 95 usability lab… high-tech video setups, sophisticated GSR measurement systems, one-way mirrors, sound baffling so reviewers could watch without disturbing testers, etc. And Microsoft brought in thousands of users… with a total bill that ended up in the millions.
But really, to set up your “usability lab,” you don’t need anything more than a quiet, private room in which to run and monitor your tests. Maybe a video camera if you want videotaped results; but even this is optional.
And if you cannot test with a hundred users, test with ten… three… even one. The basic idea is that any test is better than none. Remember, you don’t want to get too far away from users’ needs at any point—that is, unless you’ve grown attached to the sound of a ringing phone.
To prepare for the test:
- Pull out the list of users you interviewed and ask your marketing department or sales force for additional candidates. Contact the clients and schedule the tests; let them in on the exciting news that they’ve been selected to help you design the new application’s interface.
- The team should write up a scenario for using your product that makes sense in the prospective testers’ business. (Example: “You are a sales assistant for the XYZ corporation; you provide field support for the sales force, enter orders, etc….”)
- Zoom in on key tasks to be performed under the scenario, map them to your task list, and write up a test script that incorporates these tasks. Provide sample data—as real-world as possible—for the test. Provide no information about how to accomplish the tasks, only what tasks need to be done. Provide any data that would need to be entered (better to take care of this for them so they don’t get distracted by having to think up data).
- Determine ahead of time how you’ll quantify the test. Will you measure the time it takes the user to complete each task? Will you count positive vs. negative comments? Will you create a usability ratings system?
- Rehearse the test using your new script with someone from your company who hasn’t already seen your prototype. Try to knock as many kinks out of the prototype (and the test script) as possible.
- If your system needs to display error messages or prompts, make sure you design these ahead of time. There probably won’t be time to do this during an actual test.
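However you decide to quantify the test, settle the record format before the first session so every session is scored the same way. A minimal sketch of one approach, timing and success-tracking per task; the field names and sample records are illustrative only:

```python
from statistics import mean

# One record per (user, task) attempt, filled in by the scribe during testing.
results = [
    {"user": "tester1", "task": "save a report", "seconds": 45, "completed": True},
    {"user": "tester2", "task": "save a report", "seconds": 180, "completed": False},
    {"user": "tester1", "task": "email a report", "seconds": 60, "completed": True},
]

def summarize(records):
    """Average completion time and success rate, per task."""
    by_task = {}
    for r in records:
        by_task.setdefault(r["task"], []).append(r)
    return {
        task: {
            "avg_seconds": mean(r["seconds"] for r in rs),
            "success_rate": sum(r["completed"] for r in rs) / len(rs),
        }
        for task, rs in by_task.items()
    }

summary = summarize(results)
```

Tasks with low success rates or long average times are your first candidates for redesign between test sessions.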
When running the tests, you’ll need three people to work with the user and prototype:
- A “facilitator,” who can put the user at ease, explain the parameters of the test, and basically monitor the status of the test. Your team’s “people-person” is the best fit for this. This is the only speaking role in the test; all other team participants should just watch.
- A “computer,” who will manipulate the paper prototype in response to user input. The computer should not talk at all; they can optionally “beep” if the user makes an error (if that’s not too silly). Usually the team’s programmer takes on this task.
- A “scribe,” who will record at each test step (1) what the user did, (2) what the user was trying to do, and (3) an analysis of any stumbling blocks found. The team’s technical writer is a natural for this job.
Here’s how a typical test should go:
- Bring your users in one by one. Greet them in the lobby and lead them to the test room. Offer them some juice or coffee. Make them feel comfortable and at home.
- The facilitator should introduce each test user with a little “spiel” that explains the following:
- You’re testing a prototype of a new system
- That any response the user makes is valid; there are no “wrong answers”
- That what’s being tested is the prototype, not the user! (Stress this one!)
- The scenario for product use (“you are a sales assistant…”)
- Any additional needed materials (e.g., draft documentation, a telephone if you want to simulate phone support, etc.)
- Offer to answer any questions they have at this point. And also note there will be another “Q & A session” after they finish the test.
- Hand them the printed list of test tasks and scenarios and get started with the test.
- As your test progresses, you will definitely notice areas where your product will need work. Don’t let that discourage you! You’re trying to find the problem areas in the design. The facilitator should let the user struggle just to the point of frustration, but not beyond.
In some cases, this can be somewhat painful to watch for the designers as well as the test users. Designers might want to call out hints to the test user. Try to avoid this. It’s up to the facilitator to keep the user focused on the product for as long as is needed. If the user starts to get so frustrated it would stop the test, it’s OK for the facilitator to stop that test step and give the user hints to keep things moving. But we’re not trying to teach the user how to use our prototype system. We’re trying to find the stumbling blocks now, when we can still make changes easily (it’s only paper).
- When the test is done, ask the user, “How did you find the experience?” “Is it hard or easy to use this product in general?” “What areas of product function are easiest or hardest to grasp?”
VI. Iterate your design
Between each test, have a quick meeting with the design team to discuss your “gut feelings” about the test. Decide if there are any areas you’d change right now. Then get out the scissors and tape and change them! Bring the changed version to the next test and retest your changes. Re-work again if needed.
The idea is to quickly refine the product design between tests. That’s how you maximize the “bang for the buck” inherent in this process.
VII. Where to go from here
Beyond this point, you move into the realm of actually coding your application. It’s important to use all the information you’ve gathered to this point. Most commonly, the “scribe” who wrote up the tests goes on to write a complete specification for the product’s look, feel and function. You want to be very specific about every aspect of the application. How does it behave? How does it respond? How does it cue users about what to do next? How does it report errors? How does it show success?
Once parts of your application are coded and can run fairly well, it’s probably also a good idea to re-test the “high-tech” version with a new set of users (make sure they fit the profile you first developed). At this point you want to confirm your previous results.
This article has dealt very generally with a design workflow you can use for applications. Below you’ll find a few great books you can use to take your design journey even further.
Basic design scenario work and cognitive issues
- Carroll, John; Scenario-Based Design; New York: John Wiley and Sons (ISBN 0-471-07659-7)
- Booth, Paul; An Introduction to Human-Computer Interaction; 1989; New York: Psychology Press (ISBN 0-86377-123-8)
- Zetie, Carl; Practical User Interface Design: Making GUIs Work; 1995; New York: McGraw-Hill (ISBN 0-07-709167-1)
- Tognazzini, Bruce; Tog on Interface; 1993; New York: Addison-Wesley (ISBN 0-201-60842-1)
- Nielsen, Jakob; Usability Engineering; 1994; Academic Press (ISBN 0-12-518405-0)
A style guide (good for building your paper prototype kits) and usability testing handbooks
- Fowler, Susan and Victor Stanwick; GUI Design Handbook; 1997; New York: McGraw-Hill (ISBN 0-12-263590-6)
- Rubin, Jeffrey; Handbook of Usability Testing; 1994; New York: John Wiley and Sons (ISBN 0-471-59403-2)
- Dumas, Joseph and Janice Redish; A Practical Guide to Usability Testing; 1993; Norwood, NJ: Ablex (ISBN 0-89391-990-X)
Good to give your manager if s/he says “usability is too expensive”
- Bias, Randolph G. and Deborah J. Mayhew; Cost-Justifying Usability; New York: Academic Press (ISBN 0-12-095810-4)
Theory (if you’re so inclined…)
- Laurel, Brenda; Computers as Theatre; 1993; Wokingham, UK: Addison-Wesley (ISBN 0-201-55060-1)
- Laurel, Brenda; The Art of Human-Computer Interface Design; New York: Addison-Wesley (ISBN 0-201-51797-3)
- Norman, D. A.; The Psychology of Everyday Things; New York: Currency/Doubleday (ISBN 0-385-26774-6)
- Norman, D. A.; Things That Make Us Smart: Defending Human Attributes in the Age of the Machine; New York: Addison-Wesley (ISBN 0-201-62695-0)