Web Design and Cognitive Clinical Interview
With standard usability tests, investigators are usually able to answer "how" performance is executed. For example, investigators record a user's flow of behavior. However, the question of "why" is often left only indirectly answered (e.g. by think-aloud approaches), which can lead to faulty interpretations of data. The cognitive clinical interview can be used to attain direct answers to both how and why users perform tasks as they do. Originally implemented to better understand children's mental functioning, the cognitive clinical interview is now seen as a tool for usability engineering. It can be used to complement other usability tools by conducting the interview after standard usability data is collected, and it can be interwoven into other usability tools such as expert reviews, focus groups, and task analysis. In the age of technology, we experience constant change in the presentation of and access to material, extending from television and telephone operation to library research to e-mail and web site use. Therefore, we must continue to examine how people interact with information to make the transition to different presentations of information easier to parse and to use.
There are several components to the cognitive clinical interview. Similar to usability evaluations, the investigator plans tasks and hands-on materials to structure the interview. The investigator also prepares so that he or she is able to "follow" the user's lead. In other words, while there are standard materials, the investigator also follows up on points that the user makes that may not have been a part of the original plan but lead to deeper understanding of the original tasks. The investigator hypothesizes and tests hypotheses about what the user is thinking about relative to the tasks and perceived ease-of-use. The investigator uses probing questions such as "How do you know?" and "Why?" to reduce the number of assumptions the investigator makes about the user's thinking processes. Formulating and testing hypotheses through probing questions is what sets the cognitive clinical interview apart from other interview formats. The following section exemplifies these cognitive clinical interview components.
How to use the Cognitive Clinical Interview
The interview focused on navigation of a store web site ("Store A"). Before the interview began, several tasks were devised to facilitate discussion. We begin with a task where John is asked to find the store nearest to a given location. It seemed like a task he would complete quickly and was intended to ease him into the interview.
LOCATING A STORE
Investigator: Let's begin by exploring Store A's web site and trying to find out where the nearest Store A is to you. Here is your address (John is given a piece of paper with an address.). You started out by going to "store locator"?
Participant: Yes. If it wasn't there I'd go to company info (John reads by following the mouse across the screen. He fills in all of the address information except for Country, which is the only field that has an asterisk signifying that it must be completed. John is sent to an error page and uses the back button to return to the previous page).
Investigator: Did that surprise you when that happened?
Participant: Yes, I realized I didn't select country. That's why I went tst. (John enters all of the data and is then brought to a page that asks if this is the correct address before getting to the nearest location page.)
Investigator: What were you expecting to see here?
Participant: I was expecting to see the closest [Store A] to Holmdel. I didn't expect this screen to come up and tell me to go to another window. (John is shown a list of only one address, the one he entered, and is prompted to pick the correct address by the instructions. John clicks the address link.) Okay I found it.
Already, several important pieces of information have emerged. First, John is comfortable exploring the site. He finds the button labeled "Store Locator" quickly, and he had a back-up plan if this route didn't work. Next, when John received an error message, he made a tst sound indicating discontent. To test the hypothesis that this was not what John had expected to see, the investigator probed John, asking him if he was surprised by the outcome. John confirmed the hypothesis. When John was surprised a second time, the investigator asked another probing question to identify what John expected to happen. Within the first minutes of the interview, we learn that finding a store is not as easy as it could be. This interview, combined with several others, would lead to a strong case for making changes to the site.
Finding the store location was relatively easy, but it generated some hypotheses for the investigator. Is the address information that is currently requested acceptable? How would John make the task easier to use (more asterisks, less information required, something else)? The investigator had an opinion on how to make the task easier, but the goal was to find out how John would do it: not to rely on the user for design work, but to begin building a model of how several users expect the task to flow.
Investigator: All right. Let's go back two pages and look at the information. How did you feel about the type of information they wanted from you?
Participant: It's normal you know, whenever you're doing a search for something, when you're trying to find a store, they normally ask you for your address, which is fine.
Investigator: Is there any way you could make this easier?
Participant: Um, not use the address itself, just the city and state, or city state and zip code. I didn't think it was necessary for me to put in the country.
Investigator: Do you think that it would work if you put in less information now?
Participant: It may, I'm not sure. There's nothing here to indicate that other than the country that you need to indicate for the store, to select one. So I could assume I didn't need the address. I could have taken off the address, I mean 200 Laurel Ave and the area code. I could have just put MT, NJ with the 07748 and or maybe just 07748 would work.
Investigator: Would you like to try one of those?
Participant: Sure, I'm going to do this, just the zip code. The country you have to leave in cause it's, they have an asterisk there and they're telling us you have to have a country (He hits enter button). And it did show me. (This time the nearest location to the Zip Code appeared instead of an intermediary page).
Investigator: Okay great.
Participant: It did show me the store information. So it wasn't necessary for me to put my whole address. Just the zip code was sufficient and the country.
Investigator: Okay, how would you recommend they fix that last page?
Participant: I recommend they just have the zip code and the country.
John offered two ideas for how to make the task easier, and it was important to test them. John chose to provide the minimal amount of information, which again suggests that he is comfortable exploring the site. John seemed to prefer efficiency over completeness. The investigator restated John's conclusion to make sure it was accurate to conclude that John preferred efficiency and to make sure he had completed his thoughts on the topic. Notice the investigator did not recommend a method for entering address information or show a preference for one of John's ideas. Also, by asking questions such as "Is this what you expected?" and "How would you make it easier?" the investigator was trying to reinforce the idea that the web site was being analyzed, not John. It is very important that the participant feel comfortable and not judged during a cognitive clinical interview.
Also note that through this investigation, we learned more about the navigation process. First, we learned that supplying only the zip code and country took John directly to the nearest store, whereas entering the entire address had produced an intermediary confirmation page. Also, the asterisk indicating that the country was required was not immediately obvious: the first time John entered the address information, he left the country blank, and that field had to be populated using a drop-down menu. Although he read the directions regarding the asterisk after he received an error message, it is unclear why the country information is necessary at all. Therefore, if other users show the same behavior, it may be better to default the country field to the country with the most stores/activity.
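John's suggestions point toward a looser validation rule for the locator form: accept a ZIP code alone, and default the country rather than rejecting the query. The sketch below (in TypeScript) illustrates one way such a rule could work; all names, the five-digit ZIP format, and the US default are hypothetical illustrations, not details of Store A's actual site.

```typescript
// Hypothetical store-locator validation: accept the minimal input John
// preferred (ZIP code alone) and default the country instead of erroring.
interface LocatorQuery {
  city?: string;
  state?: string;
  zip?: string;
  country: string;
}

type Result =
  | { ok: true; query: LocatorQuery }
  | { ok: false; error: string };

function validateLocatorQuery(
  input: Partial<LocatorQuery>,
  defaultCountry = "US", // assumed default: the country with the most stores
): Result {
  const country = input.country ?? defaultCountry;
  // A five-digit ZIP code is enough on its own.
  if (input.zip && /^\d{5}$/.test(input.zip)) {
    return { ok: true, query: { zip: input.zip, country } };
  }
  // Otherwise fall back to city and state.
  if (input.city && input.state) {
    return { ok: true, query: { city: input.city, state: input.state, country } };
  }
  return { ok: false, error: "Enter a ZIP code, or a city and state." };
}
```

Under this rule, John's zip-only entry would succeed with the country filled in for him, while an empty form would return an actionable message rather than the bare error page he encountered.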
In summary, by using the cognitive clinical interview, usability engineers can learn in detail about the experiences their users encounter. Across several interviews, common themes arise as problems and intuitive resolutions are presented by the users themselves. In review of the technique for conducting the interview, there are several key components. Pre-defined tasks and hands-on material are essential. Easing into the interview with simple tasks is important to build the participant's confidence. Reminding the participant that the material is what is being tested is important. On-the-spot hypothesis generation and testing through probing questions is key; the investigator must not assume how the participant thinks. Finally, the investigator must be able to follow the participant's lead, letting participants present their own experiences as support for their responses. Though many of these components are found in other forms of interviewing, it is the combination of activities that makes the cognitive clinical interview unique.
© Internet Technical Group
Last update: April 30, 2000
hosted by Sandia National Labs
Disclaimer: Neither Sandia Corporation, the United States Government, nor any agency thereof, nor any of their employees makes any warranty, express or implied, or assumes any legal liability or responsibility for the accuracy, completeness, or usefulness of any information, apparatus, product, or process disclosed, or represents that its use would not infringe privately-owned rights. Reference herein to any specific commercial product, process, or service by trade name, trademark, manufacturer, or otherwise does not necessarily constitute or imply its endorsement, recommendation, or favoring by Sandia Corporation, the United States Government, or any agency thereof. The views and opinions expressed herein do not necessarily state or reflect those of Sandia Corporation, the United States Government or any agency thereof.