User Testing and Peer Review

User testing and peer review are ways of getting feedback on your documents by enlisting the help of others. Good documents are never composed in a vacuum, and instructions are no exception to this.

User testing, or usability testing, involves trying out your instructions with a real potential user. In doing this, you can determine whether the "best guesses" you made in your user and task analyses were appropriate, and whether there are any significant design or content issues that need to be addressed. Usability testing almost always reveals unforeseen and often surprising problems with documents as they are used by their audience in context. In practice, there are many different kinds of usability tests and many ways of administering them. Some are done in laboratories using one-way mirrors, cameras, intercoms, and strict empirical protocols. Others are done less formally in the context of the document's actual use.

A user test is meant to help you determine the ways in which your instructions fail to work rhetorically as you intended. It is not the role of the user to give you design advice or even to tell you how to fix the problems they demonstrate.

Peer review is an entirely different kind of activity and should not be confused with user testing. Peer review also involves showing your work to other people, but those people are not necessarily members of the work's intended audience. Nonetheless, they should be able to provide helpful feedback or advice. Instead of an empirical protocol or research design, peer review usually involves very informal communication between the rhetor and the reviewer.

For this assignment, as is sadly common in the workplace, we don't have the resources to do a full usability test with selected members of the instructions' audience. Instead, we are going to take a hybrid approach. For next class, you should develop a questionnaire that allows you to test various aspects of your documents, such as their navigability, visual design, clarity, and the relevance of the information they contain.

Order the questionnaire so that the questions appearing first address user testing issues: things that can be evaluated by having an actual user perform them. The questions that follow should address peer review issues: things on which you, as the composer of the document, want advice or outside input.

Develop your user testing questions carefully by assessing their reliability (will the question get consistent results?) and validity (will the question actually measure what you intend it to measure?). For the peer review questions, consider the quality of information each question is likely to solicit. Write questions that invite deliberation and response; if a question can be answered with a quick "yes" or "no," it probably doesn't meet this criterion. For example, "Which steps, if any, did you have to reread before you could perform them?" will draw out more useful detail than "Were the steps clear?"

This questionnaire, along with a draft of the instructions, will be handed off to another section of 3120 for review.