Developing Usable Online Information for a Web Authoring Tool

This section covers the following topics:

Developing Our Checklists
Applying the Research
Usability Index and Quality Assurance Checklists
Testing the First Tapestry User Guide
Testing the Enhanced Tapestry User Guide
Developing the Checklists from Academic and Industry Research
Information Development Triangle


Developing Our Checklists

We developed our first checklist in 1992 with over 250 questions and tested it against a number of user guides. Because this single checklist proved too long, we split it into two separate checklists:

The Usability Index checklist consists of about 180 questions and evaluates usability features, such as how easily users can find and understand information. The Quality Assurance checklist has about 70 questions and tests production quality, such as ensuring that all trademarks are listed in alphabetical order.

When we considered the design of our checklists, we looked at two options: a series of questions that could be answered with a Yes or No, or a scoring system based on a set of guidelines.

Our checklists use the question and answer method because we believe it more accurately tracks continuous enhancements to a document over the life cycle of the product it describes. The question and answer method is also more consistent when a single evaluator tests a user guide at different times in the document development life cycle. A scoring system, by contrast, is open to interpretation, and we often found a wide variance in scores among different evaluators.
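
As a rough illustration of the question and answer method, the following sketch (in Python, with hypothetical category and question wording rather than items taken from our checklists) shows how each item reduces to a category, a question, and a Yes/No answer:

    # Minimal sketch of the question and answer method: each item is answered
    # Yes or No, so repeated evaluations of the same guide are easy to compare.
    # Category and question wording are hypothetical examples.
    from dataclasses import dataclass

    @dataclass
    class ChecklistItem:
        category: str   # for example, "Finding Information"
        question: str   # for example, "Is there an index?"
        answer: bool    # True = Yes, False = No

    evaluation = [
        ChecklistItem("Finding Information", "Is there a contents page?", True),
        ChecklistItem("Finding Information", "Is there an index?", False),
        ChecklistItem("Supporting User Tasks", "Are procedures task-oriented?", True),
    ]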

The Ravden checklist consists of about 120 human-computer interface questions. Each question has a qualified answer: "Always," "Most of the time," "Some of the time," or "Never." Although this method provides more accurate information, we found that it took too long for evaluators to decide which answer to choose. With our checklist, a 100-page user guide can be evaluated in three hours. We also shortened the testing time for some categories by testing only five pages or two chapters chosen at random instead of the whole user guide. For the questions themselves, our strategy was to take the concepts developed by Chignell, Mehlenbacher, and Nielson for software systems, along with ideas from existing checklists, and apply them to software user guides.

Applying the Research

Based on Chignell's findings, we included navigation strategies in our Usability Index checklist, with items on the use of a contents page, headings, index, and glossary. These are the "book" metaphor access devices that users employ for both the online and printed user guides. The following list is an example of the questions for the Headings category, which appears in the section on Finding Information.

Headings (Choose one chapter at random)

From the Mehlenbacher research, task orientation became a requirement for the Usability Index checklist. The following list is an example of the questions for the User-Oriented Tasks category that appears in the section on Supporting User Tasks.

User-Oriented Tasks

The Nielson research helped us realize the importance of grouping the many guidelines, while the Hartshorn checklist [4] provided the following five usability principles for testing software user guides:

These principles helped us organize our checklist questions. For example, the following list shows the type of questions for the Contents Page category that appears in the section on Finding Information:

Contents Page

Similar to the Usability Index checklist, the Quality Assurance checklist consists of a set of categories, including:

Usability Index and Quality Assurance Checklists

The Usability Index and Quality Assurance Checklists measure the usability index of a software user guide and are available on the Web. You can view the checklists in table format or you can download them in Microsoft Excel format and use them online or as printed forms. If you use the online version, Excel calculates the usability index based on your answers. You can also add and delete questions to better fit the requirements of your own user guides.
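
As a minimal sketch, assuming the usability index is simply the percentage of questions answered Yes (the Excel version may weight questions or categories differently), the calculation could look like the following Python fragment; the function name and the breakdown of answers are illustrative only:

    def usability_index(answers):
        """Return the usability index as the percentage of Yes answers.

        Assumes an unweighted formula; the actual checklists may weight
        questions or categories differently.
        """
        yes_count = sum(1 for answer in answers if answer)
        return 100.0 * yes_count / len(answers)

    # Hypothetical breakdown: 108 Yes answers out of 180 questions
    # gives a usability index of 60 per cent.
    print(usability_index([True] * 108 + [False] * 72))  # prints 60.0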

After applying the usability research to develop our checklists, we used them to test the user guide for the Tapestry software.

Testing the First Tapestry User Guide

In 1996, we used the checklists to test the usability of a user guide for a software application called Tapestry. Tapestry is a WYSIWYG (what you see is what you get) application for creating and editing Web documents on Macintosh computers by dragging and dropping the contents of one document into another. Creating a Web document with Tapestry is easy because users don't have to learn Hypertext Markup Language (HTML).

A user guide is an online or printed book that describes how to use a software application, in this case, Tapestry. Good user guides contain task-oriented procedures for starting, using, and exiting from an application. They are distinguished by their clear style, concise writing, user-centered organization, and effective integration of text and graphics.

We used the Usability Index checklist to test these characteristics in the first release of the Tapestry User Guide. The usability index was 60 per cent because the guide was missing a glossary, an index, descriptions of some of the menu commands, and task-oriented instructional procedures.

Testing the Enhanced Tapestry User Guide

After analyzing the results from the first usability test, we added the following enhancements to the user guide:

When we measured the enhanced Tapestry User Guide, the usability index had risen to 90 per cent.

Developing the Checklists from Academic and Industry Research

The Usability Index and Quality Assurance checklists were developed from academic and industry research and show how academic research can be applied to improve the quality of industry software products.

Information Development Triangle

Figure 1 shows a triangle of information development among academic researchers, industry product developers, and users of the Tapestry software tool.

Figure 1, Information development triangle

In 1992, the idea of using a checklist to test the usability of a user guide was inspired by an STC seminar presented by Joan Cherry [1] from the University of Toronto. This seminar described how the Ravden [10] checklist was applied to test the user interface of an online information search system for a library. This initial contact led us to other academic research, such as Cherry [2], Chignell [3], and Mehlenbacher [8].

In summary, our checklist approach was:

