Crawford, 'Reality Checks for Catalog Design'
URL: ftp://ftp.lib.ncsu.edu/stacks/serials/pacsr/pr-v4n06-crawford

+ Page 32 +

-----------------------------------------------------------------
          Public-Access Provocations: An Informal Column
-----------------------------------------------------------------

-----------------------------------------------------------------
Crawford, Walt. "Reality Checks for Catalog Design." The
Public-Access Computer Systems Review 4, no. 6 (1993): 32-35. To
retrieve this file, send the following e-mail message to
LISTSERV@UHUPVM1 or LISTSERV@UHUPVM1.UH.EDU: GET CRAWFORD PRV4N6
F=MAIL.
-----------------------------------------------------------------

You're probably not an online catalog designer--but, given the
flexibility of today's and tomorrow's systems, you might influence
the design of an online catalog now or in the future.  If that's
true, here's a suggested change in your work life once such a
design is in use:

     At least once a month, spend at least an hour or two studying
     current transaction logs for the catalog--preferably,
     reviewing whole sessions from beginning to end.

I'd almost say that this practice should be required for any
catalog or user interface designer: periodically examine how real
users actually use the system.  Library researchers call it
Transaction Log Analysis (TLA); I call it a reality check for your
design theories.

Session Analysis, Not Statistical Analysis

Most TLA work is statistical and can be quite useful (although
probably not as useful as its proponents suggest).  If the
transaction logs and statistical analyses are well designed, the
reports will show which indexes are most heavily used, which
wonderful special features aren't used much at all, and whether
anyone asks for help.  All useful information, to be sure, but not
the same as end-to-end session analysis.

+ Page 33 +

Session analysis shows you how users actually cope with the
system's design.
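The statistical reports described above amount to simple tallies
over the log.  A minimal sketch in Python, assuming a hypothetical
one-command-per-line log format (the column does not describe any
real system's log layout):

```python
from collections import Counter

# Hypothetical transaction-log format: one command per line,
# "session_id<TAB>command", e.g. "S0042\tFIND TI mark twain".
# This format is an illustration, not any actual catalog's.
SAMPLE_LOG = """\
S0042\tFIND TI mark twain
S0042\tDISPLAY 1
S0042\tHELP
S0107\tFIND AU clemens samuel
S0107\tFIND SU humorists american
S0107\tDISPLAY 2
"""

def tally_commands(log_text):
    """Count how often each command verb and each search index is used."""
    commands = Counter()
    indexes = Counter()
    for line in log_text.splitlines():
        _session, command = line.split("\t", 1)
        verb = command.split()[0].upper()
        commands[verb] += 1
        if verb == "FIND":
            # The second token names the index searched (TI, AU, SU, ...).
            indexes[command.split()[1].upper()] += 1
    return commands, indexes

commands, indexes = tally_commands(SAMPLE_LOG)
print(commands)   # which commands dominate; does anyone ask for HELP?
print(indexes)    # which indexes carry most of the searching
```

Totals like these answer the "which indexes, which features, how
much help" questions; a session analyst reads whole sessions
instead.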
Users make mistakes in every system (even menu-driven ones that
mask the errors).  Does the system's feedback instruct the user so
that the next command is reasonable--and does the user retain that
information for the next sequence?  Do most users really do a
large number of searches using the same index, or are they likely
to follow a strategy through several different indexes?  When do
users appear to become frustrated--and why?  Are there distinct
differences between quick-search users and research users (and can
you tell the difference)?  Can someone start a session, complete a
search, display the results, and leave in two minutes--and is that
a reasonable goal for your system?

If you have an open mind, and if the catalog's basic design is
flexible, session analysis can yield improvements in the design.
If you see common traps, you may see ways to avoid them, or at
least to offer specific help.  It's enormously satisfying to make
design changes (or just changes in help text) based on session
analysis, then see particular sources of error disappear in later
sessions.

The Awful Truth

Session analysis can be depressing, and probably will be at times,
no matter how good the design is.  You'll probably find that some
of the users (perhaps 2-5%) are incorrigible: they won't read
what's on the screen, they won't pay attention to any help, and
they will keep repeating the same errors no matter what you do.
Some repetitive errors call out for system changes--but some
sessions can only be attributed to abusive users.

Session analysis shows the truth of a system: how people actually
work with it.  It isn't always pretty, and you can find yourself
wanting to tell the phantom user the one tip that would cut
through their confusion--but then, you may find that the system is
already showing them that tip, and they simply ignore it.
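The repeated-error pattern just described can at least be flagged
mechanically before anyone reads the sessions.  A sketch, again
assuming a hypothetical log format with a status field (no real
system's format is implied):

```python
from collections import defaultdict

# Hypothetical log format: "session_id<TAB>command<TAB>status",
# where status is "OK" or "ERROR".  Purely illustrative.
SAMPLE_LOG = """\
S0042\tFIND TI mark twain\tOK
S0042\tFND TI tom sawyer\tERROR
S0042\tFND TI tom sawyer\tERROR
S0042\tFND TI tom sawyer\tERROR
S0107\tFIND AU clemens\tOK
"""

def repeated_errors(log_text, threshold=3):
    """Flag sessions where the same erroneous command recurs verbatim."""
    counts = defaultdict(int)
    for line in log_text.splitlines():
        session, command, status = line.split("\t")
        if status == "ERROR":
            counts[(session, command)] += 1
    return {key: n for key, n in counts.items() if n >= threshold}

# Session S0042 repeats the same failing command three times.
print(repeated_errors(SAMPLE_LOG))
```

Deciding whether such a session reflects an incorrigible user or
inadequate feedback still takes reading the whole session--and
knowing what the system actually displayed at each step.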
That, incidentally, is one of the problems with remote session
analysis by outside researchers: if you don't know how the system
operates, you may not be able to do a valid analysis.

+ Page 34 +

Not Just Spouting Off

Some previous Public-Access Provocations may have had the
appearance of theoretical pronouncements from someone not actually
doing this stuff.  Not this time.  I've been doing exactly what I
recommend since late August 1993, and I intend to continue.

I served as principal designer for Eureka, RLG's patron-oriented
search service; we currently log 25% of sessions (fully
maintaining user anonymity); and each week we print out those
anonymous logs.  I've been going through those logs each week to
categorize erroneous commands (a process that is leading to
changes in the design) and doing full session analysis at least
once a month.

Admittedly, Eureka is a functionally rich design, making session
analysis both more difficult and more useful--but the lessons I'm
learning appear to be lessons almost any online system could
teach.  Yes, the process can be irritating and frustrating, but
it's also extremely enlightening.  I grade each session based on
my sense of what the user was trying to do--and I'm a tough
grader: if users wind up frustrated, I assume it's the system's
fault, even if it's clear that they had no interest in using the
system correctly.

Library researchers look at TLA as something that should lead to
published results.  Will I be publishing the results of this
ongoing session analysis or the underlying statistical analysis?
Possibly, eventually, but that's not the purpose.  The purpose,
and the best use of session analysis, is much more direct: to see
how these reality checks can lead to better system design.  And,
to be sure, to keep system designers humble.  Which we all richly
deserve to be.

More About TLA

Library Hi Tech 11, no. 2 (1993) contains a special theme section
on transaction log analysis: 7 articles in 70 pages.
I disagree with a significant amount of what's said, but the
section does provide a good introduction to the field.

+ Page 35 +

About the Author

Walt Crawford, The Research Libraries Group, Inc., 1200 Villa
Street, Mountain View, CA 94041-1100.  Internet:
BR.WCC@RLG.STANFORD.EDU.

-----------------------------------------------------------------
The Public-Access Computer Systems Review is an electronic journal
that is distributed on BITNET, Internet, and other computer
networks.  There is no subscription fee.  To subscribe, send an
e-mail message to LISTSERV@UHUPVM1 (BITNET) or
LISTSERV@UHUPVM1.UH.EDU (Internet) that says: SUBSCRIBE PACS-P
First Name Last Name.  PACS-P subscribers also receive three
electronic newsletters: Current Cites, LITA Newsletter, and
Public-Access Computer Systems News.

This article is Copyright (C) 1993 by Walt Crawford.  All Rights
Reserved.

The Public-Access Computer Systems Review is Copyright (C) 1993 by
the University Libraries, University of Houston.  All Rights
Reserved.

Copying is permitted for noncommercial use by academic computer
centers, computer conferences, individual scholars, and libraries.
Libraries are authorized to add the journal to their collection,
in electronic or printed form, at no charge.  This message must
appear on all copied material.  All commercial use requires
permission.
-----------------------------------------------------------------