The Exchange, June 2007

Issue 14(2), June 2007

In this issue

The editorial gestalt
Editorial: And now for something completely different—the 2007 conference report
Ethics in scientific and technical communication
A new world record
Book review: Writing in the health professions
Book review: Wordless diagrams

The editorial gestalt

by Matthew Stevens (mls@zeta.org.au)

Doing an effective job as a scientific editor requires you to immerse yourself in the story. You need to spread your attention across all levels of the document, from word meaning to sentence meaning to overall meaning, aiming to see all levels simultaneously rather than switching between them. You need to learn and understand the story being told, and to view it as a whole composed of its parts; only then will you see when the parts are discordant with the whole. This takes effort, and it is harder with longer works and more complex studies. You are looking for evidence that the conclusions presented are supported, that the facts are real and consistent, that the analytical and statistical tests are appropriate, and that the authors have not contradicted themselves. When you have perfected the gestalt technique, a moment comes in every job when you think, “Yes, I understand the study.” This is an important difference between substantive editing and copyediting.

A systematic approach to editing—reaching the gestalt

Scientific editing requires intense concentration and attention to a myriad of details, from punctuation and spelling to completeness (of a reagent description, for instance), citation of references, format, meaning, and logic. Covering them all requires several passes through a text, in which the editor must focus first on one class of details, then switch to another. Have a look at a photo of a colorful scene. How many red objects can you pick out? Now look again and see how many blue objects you can see. You’ll notice that you saw many blue items on the second pass that simply didn’t register on the first pass. This is because your attention was focused on the red objects and excluded other colors. Editing is the same—when we focus on one class of details, we automatically exclude others. Therefore, complex editing tasks must be broken down.

Over more than two decades, through trial and error and by learning from other editors, I have arrived at the approach described below. It might work for you too. Note that it assumes the use of a word processor, although most of what I describe can be done on paper as well.

Multiple passes help

On the first pass, check that everything is present—title, authors and addresses, specified sections, references, captions, figures, tables and so on. Point-by-point checking of the publisher’s instructions to authors will greatly aid this process.

On the second pass, check the spelling. Checking the spelling up front flags the inevitable rare or non-standard spellings that require decisions. Knowing these before you start editing will allow you to ensure consistency throughout the job. It is less effective if you try to make a decision every time you come across an uncommon word while you are focused on meaning. If your word processor can check spelling constantly (MS Word: “Check spelling as you type”), then leave this feature permanently on. It will save you from introducing unnoticed errors, and the irritating red squiggles will remind you to verify any spellings you have deferred. A final spelling check is thus unnecessary.

The third pass is the most intensive. Start with the Abstract or Summary. However poorly written it is, it will set the scene for the document and allow you to grasp the intention of the work being described. It is essential to have an idea of the work up front. For readers, any document (fact or fiction) is structured to reveal its meaning as it is read. Editors must know that meaning up front in order to ensure that the document reveals it appropriately. Therefore you must understand the story, at least vaguely, before you work through the body text, so that you can ensure that all parts work together to convey the meaning. Spend as much time as it takes to understand and edit the Abstract or Summary. With the story firmly in your mind, you are then equipped to judge whether each part of the following text substantiates the summary, and to understand how each part fits into the story.

During the third pass you concentrate on local meaning—meaning at the level of the sentence or paragraph. Are words used correctly? Can you substitute a word with a more precise or relevant meaning? Does the punctuation signal the correct meaning, or does it mislead (for example, “My brother, John” does not mean the same as “My brother John”)? Can you express the meaning in fewer words or more simply (often the same thing)?

For now, take any figure, table or reference citations as read. You can ignore these supports unless you need clarification of an unclear passage. You will attend to these later.

When you reach the end, you should have a document that is easy to read. All the sentences are grammatically correct; all the spelling is correct; every text element is where it should be. It might be nonsense, but it is an elegant piece of prose. All impediments to reading for understanding have now been removed.

Take a break

Now put it away if you can. Our brains process the day’s learning while we sleep, during our dreaming. Numerous studies have shown that people who have slept on new information are better able to recall it than people who have not. Obviously this is not always going to be possible, but try to structure your work so that you always come back to a job on the day after the first edit. If you don’t have time to wait till tomorrow, then try to punctuate the job with another task—another editing job, a walk, a chat over coffee. When I come back fresh, it never ceases to amaze me how many things I’ve stupidly missed on the first edit.

Before you put the job away for the night, though, make a fourth pass, checking all figure and table (or other) citations. Use the word processor’s Find function to search on “fig” and “table” (or “see”, “section”, “supplementary”, “reference”, etc.). At the first occurrence, read the text that relies on the citation for support. Now go to the figure. Does it support what the text says? (A wide-screen monitor or a dual-screen setup is invaluable for this. You can have three or four full-page documents open side-by-side and compare them quickly. It sure beats constantly switching between documents and losing one behind the other. Or you can print out the figures and tables.) It can be surprising how often the figure or table says something else, even something contradictory. If you see an important point that the text does not mention, point this out to the authors in a comment—they will probably appreciate your drawing this to their attention.
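
If the manuscript is available as plain text and you are comfortable with a scripting language, you can batch this search before you start comparing. The Python sketch below is purely illustrative: the file name and the search pattern are my assumptions, and the pattern would need adjusting to the citation style of a real manuscript.

    import re

    # List every figure/table citation with a little surrounding
    # context, so that each can be checked against its target in turn.
    text = open("manuscript.txt", encoding="utf-8").read()
    pattern = re.compile(r"(?:fig(?:ure)?s?\.?|tables?)\s*\d+", re.IGNORECASE)
    for match in pattern.finditer(text):
        start = max(match.start() - 40, 0)
        end = min(match.end() + 40, len(text))
        print("...", " ".join(text[start:end].split()), "...")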

Work your way figure by figure, then table by table. When you reach the end, you can be sure that when you come back tomorrow, every statement that relies on citations for support is correct (or flagged). This removes a distraction from the next stage, when you need to focus on the meaning and cannot afford to break off to confirm a point in another document. Whereas initially you took these statements at face value, now you know that they are correct.

During the fifth pass you concentrate on meaning, both local and overall. Everything you’ve done up until this point has prepared the document for critical analysis. All impediments to understanding have been stripped away, and the real meaning is laid bare, ready for analysis and critique.

Pick up where you left off

Start again with the Abstract or Summary to refresh your recollection of the story. If there are discrepancies between the summary and the text, you are more likely to spot them now, because your unconscious has had time to process what you read.

The vital role of this pass is to ensure that every aspect of the text works together to create a coherent picture. Critical thinking is essential. At every statement that you read, pause and reflect on it. Is it true? Does it fit with your world experience? Is it reasonable? Is it supported somewhere? Does it support other statements elsewhere in the text, or does it contradict them? Understand it. In the words of science fiction author Robert Heinlein, grok it. If it doesn’t gel, try again. If it’s still iffy, add a comment explaining your worry.

To do this properly, you need to have created a mind space in which you hold the whole story and can turn over and examine the various parts. This is the gestalt. When you’re in it, you feel as though you’re humming along like a well-oiled machine. Everything is lucid. This is why coming back with a fresh mind is important. It is also vital to understand the story, both the individual parts and their interactions. Reductionism is essential to seeing the parts. But holism is just as essential to seeing the whole.

Use mental imagery to understand complexity

When you are assimilating the various facts of the document, it can help to create a mental image of them. If it’s a description of protein behavior, create a mental image of proteins as they follow a schema that maps out the process. If it’s a description of drug treatments in mice, picture the mice in their laboratory cages with different colored pills raining down on them. If it’s a description of how water samples were collected, picture yourself with the bucket and the sample bottles and see whether the method would work as described. If it’s a description of population statistics, picture the relevant country and populate it. Whenever the text returns to a point that depends on one of these facts, it’s easy to bring up the mental image again and test the new information against it. If mental imagery isn’t your forte, then draw the concepts on a scrap of paper. It doesn’t matter how unrealistic your mental or graphic representations are; the point of them is to encapsulate the key points as they relate to one another (mind mapping is excellent for this), so you can compare other facts to them.

Build up the meaning piece by piece. Understand the local meanings first (such as key concepts within a sentence), then link them into a larger concept. Gradually bring these all together until you have a full picture of a passage. If you don’t understand it at first, persevere. The authors understand it because they lived and breathed the study for several years, so don’t expect to understand it all immediately. It’s likely that you will have to work on the text, first to understand it, then to get it to express what the authors meant. You have correctly assimilated the information when you can take any part of a passage and see how it relates to any other part, or when you can negate one part and correctly predict the consequences (for instance, if you change an increase in the value of a parameter to a decrease and can say what the dependent variable would do without recourse to the figures).

Having satisfied yourself that a passage is correct, move on to the next. If you later need to revisit an earlier passage (for example, to compare with a later one or to help you understand a later concept), remind yourself that you have already verified that passage. Scan the passage for key words to refresh your memory, then use that as a support to help you understand the later passage.

This pass will be quicker than the first full edit: you’ll be looking at meaning more globally, you already understand much of the story, and you should have come back fresh, so contradictions (and confirmations) will be easier to spot this time through. In particular, you will notice any earlier statements that contradict a later statement, because you have now read the later statement, unlike in the previous pass.

At some point you must do a sixth pass to check references. Verify that all citations are referenced and that all references are cited.
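
On manuscripts that arrive as plain text, a rough first cut at this cross-check can also be scripted. The sketch below is a minimal illustration rather than a real bibliographic tool: the file names are my assumptions, and the single author-year pattern will miss many legitimate citation formats, so its output is a list of leads to verify by hand.

    import re

    # Compare in-text author-year citations against the reference list,
    # reporting mismatches in both directions.
    body = open("body.txt", encoding="utf-8").read()
    refs = open("references.txt", encoding="utf-8").read()

    cite = re.compile(r"([A-Z][A-Za-z-]+)(?: et al\.)?,? \(?(\d{4})\)?")
    cited = set(cite.findall(body))
    listed = set(cite.findall(refs))

    for name, year in sorted(cited - listed):
        print("Cited but not in the reference list:", name, year)
    for name, year in sorted(listed - cited):
        print("Listed but never cited:", name, year)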

Last but not least...

Finally, if you have inserted comments, do a seventh pass to check these. Here you will pick up any typos you made (potentially embarrassing for professionals who should know how to spell), but more importantly you may spot comments made redundant because you were able to clarify the meaning later.

Matthew Stevens (mls@zeta.org.au) lives in Australia, where he works as a freelance scientific editor. He is the author of the recently published Subtleties of Scientific Style, from which this article is extracted. The book is available at http://www.zeta.org.au/~mls/subtleties.html as PDF or hard copy.


Editorial: And now for something completely different—the 2007 conference report

by Geoff Hart (ghart@videotron.ca)

This time, I've decided to take a break from the usual editorial essay and try something a bit different: a report on some sessions I attended at the STC annual conference in Minneapolis. Because of various SIG-related business, ad hoc meetings, and the inevitable scheduling conflicts, I didn't get to attend all the sessions I wanted to see. Plus, there weren't many sessions specifically of interest to scientific communicators (a topic I'll return to at the end of this essay), so I was forced to pick and choose a fairly eclectic group of topics. Herewith, my selections and some of the more important and relevant points:

Keynote speech: Simon Singh

Simon Singh (http://simonsingh.net) is that rarest of birds: someone equally at home working in a particle physics lab, writing popular science books, and directing documentary films. As the keynote speaker, Simon spoke about his documentary on Fermat's last theorem (http://simonsingh.net/The_TV_Film.html), and although I'll bet that very few audience members (possibly some of you!) had any prior interest in or understanding of the subject, he still managed to captivate the audience. This was due in large part to Simon's own considerable charisma, but he also made several points relevant to all technical communicators.

For example, he emphasized the importance for a film-maker of establishing a relationship and rapport with the person being interviewed—a point anyone who's interviewed scientists and engineers to get the facts will appreciate. Some of the key scenes obtained during filming would have been impossible without that relationship. Getting close to your collaborator in film is particularly important given that a typical hour-long documentary may have room for no more than about 8000 words; most of the remaining content will have to be carried by the interview subjects, and they'll carry that burden much better if they can interact naturally with you. You'll still have to cut a lot, and really focus in on the essentials, and that's a task all of us face. To satisfy his own fascination with the subject and provide the details that were inevitably cut from the documentary, Simon used his research to produce a book entitled Fermat's Last Theorem in the U.K. and Fermat's Enigma in the U.S.—an interesting difference in title resulting from someone's perception of the trans-Atlantic cultural and language barrier. For more information on Simon's books, visit his Web site (http://simonsingh.net/Books.html).

A somewhat thornier issue involved Simon's conscious choice to edit the words expressed by a speaker during an interview. In this specific case, a mathematician being interviewed used the word "primes" to refer to prime numbers. Given the lack of time and space to explain the meaning of this term, Singh replaced the word with "numbers" while editing the film. The ethics of this situation are of interest to scientific communicators because we often face similar issues on the job when we must simplify complex scientific information for a more general audience. Singh's take on this was that his edit did not substantially change the meaning, and thus did not misquote the speaker; more importantly, the final words conveyed the most important part of the meaning (i.e., numbers, not the specific type of numbers). Though I'm not personally comfortable with the ethics of his choice (changing a quote) and don't consider the change to have been necessary, I have no problem whatsoever with the larger principle of conveying only the key information to a non-technical audience, even if this requires more simplification than a scientist or a purist like me might prefer.

Case studies of publishing in special environments

Nicoletta Bleiel (nickyb@componentone.com) started this session with a talk on the use of templates as an aid in single sourcing. If you've already grappled with the issue of standardization, much of the material in this presentation will be familiar and relevant to you. Bleiel made some strong points that I've seen escape many other technical communicators. For example, even the best-designed template in the world may not be quite as clever as you'd like to think. The only way to tell for sure is to test the template using real sample text—an often neglected step. A few other points of particular interest:

David Caruso (dcaruso@cdc.gov), a health communications specialist with the National Institute for Occupational Safety and Health (http://www.cdc.gov/niosh/), spoke about his efforts in risk communication. Caruso's challenge was to present a wide variety of safety-related technical information to a widely varied audience in the mining industry. He faced many constraints in this work, including the need for multiple modes and media of communication (e.g., journals for mining engineers vs. videos for miners) and the ubiquitous problems of "my subject-matter expert (SME) is too busy to speak to me" and shrinking budgets. Many of the SMEs he works with must publish in peer-reviewed journals and give presentations at professional conferences to advance their own careers, yet the audience that actually benefits from their research (miners) will rarely if ever have access to this information. Thus, alternative means of information dissemination were required. At the time, NIOSH generated many printed reports and some very old-fashioned VHS videos that contained good information, but which clearly did not excite their audience enough to be routinely used; moreover, the two were packaged separately and often became separated. His solution was to keep the videos and text together in a single package, using the commercial model of the time (think Disney videos): an oversized VHS case with a space for printed materials. This has since been replaced by the standard Amray DVD case, which has clips to hold printed materials alongside the disc.

Caruso found it challenging to obtain the necessary content within the constraints of the NIOSH research project framework, and to convince SMEs that the technical content would not be unacceptably diluted by the need to simplify it for a non-SME audience. Thus, there was considerable need to educate SMEs about audience needs, basic communications principles, and information design, and to gain enough credibility with them to permit a change to the new approach. I won't go into more detail, since I'm hoping to coax Caruso into contributing an article for this newsletter, but his initial results have been highly promising.

Gluing their eyes to your screen

STC Fellow Karen Schriver (kschriver@earthlink.net) has gained a well-deserved reputation for performing insightful, comprehensive research and literature reviews, and presenting that research in a format that's palatable to notoriously theory-averse practitioners. Her talk built on her successful 1997 book, Dynamics in Document Design, by exploring how more recent research has improved our understanding of best practices in visual design. Some key points to keep in mind:

A viewer's emotional response to a design precedes their interpretation of the design, and is almost instantaneous (it takes about 50 milliseconds). For example, if a page looks dense and intimidating, viewers may be strongly demotivated to keep reading. This initial impression (positive or negative) can be difficult to change, and tends to be very consistent across a range of viewers; thus, it really is true that "you never get a second chance to make a first impression".

Writing for the Web—2007 edition

STC Fellow Ginny Redish (ginny@redish.net, http://redish.net/) reviewed the published literature and her own research on "best practices" in writing text for Web sites. Her presentation focused initially on the topic of designing a Web page to increase the ease of finding information, and continued with information on increasing the effectiveness of a given presentation once the information has been found and will be read.

Redish started by reminding us about the need to emphasize our audience: an effective site focuses on the most important things visitors have set out to accomplish, not necessarily on our goals for creating the site. We often forget that their goal is never "visit our Web site"; instead, the goal is to accomplish something specific. As Redish noted, "People don't come to the Web for documents: they come for information." In this context, good content can be defined as content that is easy to find and reach, that answers the reader's questions, and that thereby solves their problem. Readers also prioritize currency (information that is updated with an appropriate frequency), ease of use, and high quality. For any Web page that is time-sensitive, it's important to assign an owner to the page who will ensure that the page remains up to date—or who will remove it from your site when it is no longer relevant.

People achieve their goals by scanning, skimming, and selecting what to read in more detail: statistics show that people spend an average of 25 to 30 seconds on a home page, 45 to 60 seconds on interior pages reached from a home page, less than 2 minutes in total before deciding whether to abandon a site or keep looking, and about 4 minutes actually doing the reading once they find what they're seeking. (These data come from Nielsen and Loranger's 2006 Prioritizing Web Usability.) Clearly, we don't have much time to attract and hold their attention! Readers often won't scroll below the bottom edge of the screen unless we give them a good reason to do so; thus, it's important to keep as much important information as possible near the top of the screen, at least until we get to a page that is designed for extended reading. Pathway pages take advantage of this principle: they serve the primary goal of getting people to their next destination in a hurry, and should thus emphasize information that accomplishes this goal. Experience has increasingly shown that we shouldn't expect people to read extensively before they reach their goal, and that we must design to support browsing instead. Redish's new book Letting Go of the Words (www.redish.net/writingfortheweb) will be of particular interest to those of us who are word geeks, yet who must also design Web pages: the book will provide many strategies for minimizing the quantity of text and thereby increasing the effectiveness of your Web content.

Redish noted an important distinction between information on the Web and in print: even though readers always interact with text to some extent, reading on the Web is much more clearly a dialogue between the reader and the Web site because the Web site responds to questions (i.e., by letting us follow links or use search tools). This suggests that the kind of thought process used in think-aloud usability studies can be applied to Web design: if you can understand the kind of question a reader will ask, you can write to answer that question and the questions that follow logically from that first answer. In such an interaction, it's important to "give people the scent", a metaphor borrowed from the use of bloodhounds to track people: in the context of the Web, the "scent" is the ensemble of clues that reassure browsers they're following the right path and should keep going. People will keep browsing and clicking and scrolling only so long as they're confident they're headed in the right direction. One complication in such dialogues is that you're not there to explain and revise your text, thus you must carefully manage the reader's initial interpretation: readers actively guess what you mean when they read headings and links, and those guesses are often biased by their current mental state.

One commonly advocated best practice for Web design involves converting titles into links rather than using "click here" links; thus, online titles must be clearer than their on-paper equivalents, particularly if there won't be much context beyond the title to convince a reader to follow a link. This means that greater care is required when choosing a link title. "Hover text" that appears when you hold the mouse cursor over a link can provide additional information that clarifies the meaning of the link, but because the presence of this text is not obvious and because displaying it requires an additional step, it's not as effective a solution as choosing a clear title in the first place. The choice of link text can be even more important for certain audiences. For example, blind users "scan with their ears", and sometimes choose to "display" (i.e., read aloud) only the links on a page when using screen-reader software. (Interestingly, this follows the skimming approach used by sighted readers, but using a different sense.) Thus, links should be distinct: they should not all begin with the word "link", and should emphatically not all say "click here".

Copies of the slides from this presentation can be obtained from the STC Web site: <http://www.stc.org/54thConf/sessions/sessionMaterials05.asp?ID=67>.

Information architecture for mobile devices

Ever-popular speaker Bogo Vatovec (bogo_stc@bovacon.com, http://bovacon.com/) discussed some of his research and design work for portable devices such as cell phones. Given the small screen size in much scientific and technical equipment, and the increasing use of portable devices such as handheld dataloggers, his presentation provided some potentially important lessons for scientific communicators.

Challenges faced by designers start, most obviously, with the small screen size: the devices wouldn't be portable if they had a large screen! But designers also face challenges that arise from differences in keyboard features and operating systems. Worse yet, market forces lead to endless revisions and updating of portable devices to include the latest and greatest features. So even within a given company's product line, you may have severe backward-compatibility issues between versions of the same hardware. Memory and processing power are also severely limited, even for current-generation equipment, so minimalist design becomes especially important.

Because we rarely have any control over the display device, except in the limited situation of dedicated hardware and software combinations, we must emphasize content over form. This may mean, for example, that we must adopt "fluid design" principles, such as designing text to reflow to fit the screen size rather than hardwiring text and layout for a specific screen size. Our lack of control over the output device, combined with the relatively low processing power of these devices, means that some products simply won't work on a portable device: there will be no HD-ready Flash animations available for your cell phone any time soon, particularly if the file must be downloaded over an erratic cell connection rather than uploaded to the device over a fast USB connection. Where graphics are important, they must be redesigned to fit the screen; resampling is rarely satisfactory, particularly when middleware (software that lies between a server and the portable device) makes its own attempts to optimize a graphic. For some special-purpose applications, it may be better to write your own software for the mobile instead of relying on built-in software (such as Web browsers) so that you can control both the content and the presentation.

Because portable devices do not permit large pages with multiple columns, it becomes important to abstract information into its simplest functional components, and design around those components so you can more easily remap the information onto the available screen space—essentially, all we have available is a one-column display that supports simple text better than complicated layouts. In converting tabular or multi-column information from a typical, overcrowded Web page, we may find that it's necessary to convert each cell or column in the grid into a single screen on the mobile device. Sometimes the content must be entirely different because it's not possible to map a single chunk of content between two radically different uses; the in-office and on-the-road tasks and the contexts in which those tasks are performed may simply differ too greatly. This suggests that some information can be single-sourced, but other information must be custom-designed for mobile use. Although tools exist to automatically convert content between in-office and mobile devices, they don't yet work well or consistently (particularly if the designer did not think of the possibility of using the information on a mobile device), and may not do so for a long time to come. Some things convert better than others: a paragraph of text works well, for example, but a complex home page will probably have to be redesigned from scratch.

Before developing or repurposing information for a mobile device, it's important to carefully think about your goals and the constraints that might stop you from reaching those goals. Creating a business case is a very useful tool to support this thought process because it forces you to consider whether your ideas are economically feasible. If you are forced to create two entirely separate sets of information, that can effectively double the effort required to generate and maintain the information. (Vatovec did not say "double"; I chose that word to dramatize the potential magnitude of the cost increase if you double the amount of information you must create and maintain. The actual increase may be even greater.) If you decide to proceed anyway, you must be confident there are enough users to justify the high cost of creating and maintaining a new body of information, particularly if you must support a diverse range of mobile devices. Here, more than in many other applications, you must have a deep understanding of the task you're designing the information to support and the context in which that task will be performed. Then, given the limitations of a typical portable device, you must design based on the absolute minimum amount of detail you can provide that will support that task; whereas superfluous details can be supported on a 17- or 19-inch monitor, they must be ruthlessly eliminated for a portable device. The HTML Dog site (http://HTMLdog.com) offers a report on how CSS and HTML work on a wide range of mobile devices, including different models and versions of a given device, and this represents a great resource for this kind of design project.

It's important to remember that portable devices are much less usable than office computers: the input devices are smaller and harder to use, and the small screen size creates a need for considerably more scrolling and clicking. Where possible, you can minimize the burden on the user by identifying the most likely choices and using them as default values, and choosing options such as pick-lists to minimize the need to type. Providing clear feedback on the status of an operation is even more important than on an office computer, since the response latency may be much higher for mobiles due to a combination of low CPU power and low bandwidth for transferring data. Bandwidth constraints also suggest that it may be more efficient to download relatively large pages and use scrolling rather than breaking the information into many chunks, each of which must be downloaded separately; however, we face a tradeoff because older devices may have insufficient memory to receive a single longer page, forcing us to break information into multiple pages.

Because the small screen provides much less context, it's easier to forget where you are, thereby greatly increasing the risk of becoming "lost in cyberspace". Thus, it's helpful to repeat navigation elements that minimize the need to scroll back to the top of a page just to remind yourself where you are. Making context explicit (e.g., using breadcrumbs) helps. In this context, using a zoom-in architecture (with few options per screen, but a deep hierarchy of screens reached from these few options) works better than the broad architecture that has been proven to work well on the Web. Using flowcharts is a great way to test such an information architecture before you actually try to implement it.

Information retrieval

Kristine Henke (khenke@us.ibm.com) and Korin Bevis (kbevis@us.ibm.com) talked about developing ways to organize and categorize information so it will be easier to find. They noted that search tools are becoming the premier means of finding information online, but that people still do navigate using hierarchies. To support both approaches, they talked about classifying information to support progressive refinement of a search, gradually narrowing down the list of hits, or (when following a series of links) to ensure that the hierarchy is clear throughout the navigation. If you've studied biology before, you can think of this as akin to working your way through a dichotomous key to identify an organism. (And if you're not familiar with keys, here's one useful primer: http://en.wikipedia.org/wiki/Dichotomous_key).

Good classification begins with an appropriate division of information into higher-level categories. A familiar model for online help might be a division of topics into tasks, concepts, reference material, examples, and scenarios. Each of these categories can then be subdivided into subcategories, then into further subcategories, eventually reaching a single topic or small group of topics, with each topic presenting only one idea or focusing on only one thing. Consistency in defining the various categories is an important part of making this approach usable. For example, all tasks could begin with gerunds (printing, finding, saving, etc.), whereas all examples could begin with the word "Example:" and all references could be written as noun phrases. This consistency helps readers recognize more quickly where a given topic belongs among the categories. Topics can also be organized hierarchically into "topic collections". The problem with hierarchies that aren't based on taxonomy is that even when they're logical and defensible, they may simply not match the hierarchy that is assumed by users of the information.

Taxonomies help, since they clarify the relationships among categories and subcategories sufficiently well that users of the taxonomy can figure out where they are and where they're going. (This clearly relates to the concept of "scent", discussed earlier in this article under the Redish presentation.) A good taxonomy requires clear parent-child relationships, and consistent use of controlled terminology to assist in the recognition of those relationships. For example, you could classify animals as pets vs. non-pets, subclassify pets into cats and dogs, then subclassify cats into long-haired and short-haired. You can design a hierarchy to support a specific purpose, borrow and modify someone else's hierarchy, or evolve a hierarchy based on the actual content. At IBM, Henke and Bevis and their colleagues chose the latter approach, which required careful analysis of the details of every document in the document collection being classified so as to create mutually exclusive subject groups (also called "facets"). In listening to their description, it occurred to me that such an analysis resembles the thought process underlying indexing in many ways, and must be similarly systematic and consistent—indeed, indexers can provide good guidance in this analysis because they have considerable expertise in this kind of analytical process.

Once you have an initial collection of topic keywords, it's important to begin testing them with future users of the taxonomy: apart from ensuring that your taxonomy will be usable, it's also much easier to revise a taxonomy at this early stage than after the information has been classified, because reclassifying information requires re-examination of all documents to account for the proposed changes. Usability experts can help in this phase of your analysis, since they have considerable expertise identifying common problems. However, before wasting anyone else's time, it's important to do your own tests so you can eliminate the problems that become obvious only after you've created them. Parsing a taxonomy (thinking through the meaning of the relationship between two levels in the hierarchy) is a useful tool for accomplishing this. Using the abovementioned animal example, you could parse the hierarchy as follows: "A short-haired cat is a kind of cat, a cat is a kind of pet, and a pet is a kind of animal." All information in the hierarchy must follow a similar logic. Illogical results identified by this type of analysis suggest a problem with the hierarchy. Taxonomies should also permit inheritance: if you define something as a cat, it can automatically be classified as a pet and an animal. As in all usability testing, it's important not to focus exclusively on your own artificial tasks: real users inevitably reveal things that you've missed or misunderstood. When testing their performance, pay close attention to the point at which they would give up on a search: real-world users give up long before people who know they are being tested and watched.
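
The parsing and inheritance rules just described are simple enough to prototype in a few lines, which can serve as a sanity check on a draft hierarchy. The sketch below uses the article's pets example; the dictionary representation is my own illustration, not the implementation Henke and Bevis described.

    # A draft taxonomy: each term names its parent (None marks the root).
    taxonomy = {
        "animal": None,
        "pet": "animal",
        "cat": "pet",
        "dog": "pet",
        "short-haired cat": "cat",
        "long-haired cat": "cat",
    }

    def ancestors(term):
        """Walk up the parent chain, e.g. cat -> pet -> animal."""
        parent = taxonomy[term]
        return [] if parent is None else [parent] + ancestors(parent)

    # Parse the hierarchy: every parent-child step should read sensibly.
    chain = ["short-haired cat"] + ancestors("short-haired cat")
    for child, parent in zip(chain, chain[1:]):
        print("A", child, "is a kind of", parent + ".")

    # Inheritance: anything classified as a cat is automatically
    # a pet and an animal.
    print(ancestors("cat"))  # ['pet', 'animal']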

This kind of hierarchical taxonomic classification works best with software that lets you progressively narrow down your results by repeating the search within each set of results. For example, after finding all animals in your initial search, you could narrow your search to include only animals that are pets, then only pets that are cats, then only short-haired cats. The taxonomy you have developed can also be used to create the kind of expanding and contracting tree structure found at most online stores: as you select a category, the site automatically presents all relevant subcategories, allowing you to dynamically narrow your options until you obtain a manageable number of products to compare. An advantage of this approach is that it facilitates backtracking if you've gone too far down the wrong path. In such structures, displaying status indicators such as "page 2 of 12" or "203 hits" helps users recognize when they have narrowed their results enough that they can stop searching and start reading. The granularity of the classification (how many subclassifications are permitted or required) depends both on the nature of the content and how it will be accessed. At very fine levels of granularity, where the hierarchy can grow fairly deep, it's helpful to ensure that the context is clearly visible (e.g., use breadcrumbs to show the path followed to reach a certain point).
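
Progressive refinement of this kind amounts to repeated filtering on facet values. The toy records below are my own invention, not data from the presentation; the point is simply to show the narrowing search, with a hit count reported at each step like the status indicators just mentioned.

    # Narrow a result set one facet at a time, reporting hits at each step.
    records = [
        {"name": "Felix", "kind": "cat", "pet": True, "hair": "short"},
        {"name": "Rover", "kind": "dog", "pet": True, "hair": "long"},
        {"name": "Tiger", "kind": "cat", "pet": False, "hair": "short"},
        {"name": "Misha", "kind": "cat", "pet": True, "hair": "long"},
    ]

    results = records
    for facet, value in [("pet", True), ("kind", "cat"), ("hair", "short")]:
        results = [r for r in results if r[facet] == value]
        print(facet, "=", value, "->", len(results), "hits")

    print([r["name"] for r in results])  # ['Felix']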

Maintenance is also an issue. Taxonomies must be reviewed periodically to ensure that they remain relevant and correct, and to account for new information that doesn't fit into existing categories, so that the categories can be updated when necessary. Long-term monitoring of the use of your taxonomy will also reveal whether users are learning to use it, or whether problems you hoped would disappear are stubbornly persisting, possibly suggesting the need for a redesign. Documenting your decision processes and the decisions that resulted from them will help future workers understand what you've done and how to use that knowledge to classify new information or revise the taxonomy.

A call for papers!

I started this essay with the complaint that there wasn't much specifically related to scientific communication at the conference. In my view, there's only one way to fix this: for us to participate. If you're interested in speaking at next year's conference, drop me a line and tell me what areas you'd be interested in presenting on; if you're only interested in attending someone else's presentation, send me a list of topics that would encourage you to attend the conference and I'll try to find suitable speakers. The process for selecting the conference program has changed radically in the few years since I served on the program committee, so I can't make any promises; proposals are still screened in (or out!) based on their quality, but the program committee also excludes otherwise-suitable topics if they simply don't fit into the larger theme for the conference or the overall "user experience" they are hoping to achieve for a particular theme (such as usability).


Ethics in scientific and technical communication

by Jean Hollis Weber (jean@jeanweber.com)

An earlier version of this article was first published in WISENET Journal 38:2-4 (July 1995).

For more than 30 years I have edited scientific and technical materials aimed at both specialist and general audiences. I have also done a lot of technical writing over the past 20 years and, more recently, taught professional writing (mainly technical writing).

During this time I have witnessed or been faced with many situations that I would describe as ethically suspect, and I have read about many more.

This article is a brief summary of some general categories of ethical issues. None of this material is original; many articles have been published on ethics in communication. Each topic could be (and has been) the subject of whole articles.

Knowing and doing—taking personal responsibility for one's actions

The Society for Technical Communication runs an ethics column in Intercom four times per year. A typical situation from the workplace is described and readers are invited to choose which of several proposed solutions they would follow (or to propose a different solution). The responses are published in a later issue.

One thing that clearly emerges from these columns, and from newspaper articles and discussions with scientific and technical communicators, is the dichotomy between knowing what's right and wrong and applying that in situations where your job, or possibly even your life, could be placed in jeopardy if you do the right thing. We've all read about the "whistle-blowers" who are demoted, sacked, harassed, and so on, for making public some information that someone in power did not want known.

I think that all ethical questions boil down, at some point, to accepting personal responsibility for one's own actions, not hiding behind "I was only following orders" or "that's not my job" or some variation on those themes.

Is it legal? Is it ethical?

Legal and ethical are not synonymous. Slavery was legal in parts of the USA until the Civil War. Australian law seriously restricted women's rights until fairly recently. Wife-beating is still legal in some parts of the world. Emerging technologies mean that the law is often well behind the times, but we must make choices now, not wait for the law to catch up.

We read particularly about issues in genetics and medicine, but there are plenty in scientific and technical communication as well. Computer technology and the Internet have given us the ability to access, distribute, and copy information more quickly and easily than before. Censorship is difficult; so is policing intellectual property rights.

"But it's not illegal" is no excuse for failure to accept personal responsibility for your ethical choices.

Behavior toward colleagues, subordinates, and others

One general category of ethics in communication covers such things as:

Plagiarism versus credit for work done by others.

We all know about people in power taking credit for work done by colleagues or subordinates; it seems to be a common part of the way business is done. It's especially common when the powerful person is a man and the less powerful one is a woman.

Harassment and undermining of a person's position.

This covers a multitude of behaviors, some extremely subtle, such as the constant and deliberate misinterpretation or misrepresentation of someone's actions. ("She comes in late and leaves early; she isn't pulling her weight or isn't serious about her work", when in fact she is working evenings at home.) Volumes have been written about the application of this ethical misbehavior to keep women in subordinate positions.

Stupid vs. malicious actions.

Everybody makes mistakes. Do not jump to the conclusion that an action, no matter how awful, was deliberate (unless, perhaps, the person has a history of malicious actions). Most ethical lapses are genuine mistakes: someone didn't think about the consequences of their actions, or didn't realize that something might be misinterpreted.

Another common situation is the dilemma of "if I tell the truth, someone will be hurt by it". Which is more important? Depends on the situation. Your interpretation of what they should have done might be quite different from mine.

Dealing with experimental subjects, interviewees, etc.

The whole issue of informed consent and ethical behavior in dealing with experimental subjects, interviewees, etc. is too complex to be covered in this short paper. Nonetheless, it's an issue we must be aware of and must research if it affects our work.

Telling the "truth"

Here's an area that seems to be clear-cut:

Less clear-cut, because they are not always so easy to do, are the following:

Rhetoric: choosing your words

Even if you have the facts, you can distort the message, either deliberately or accidentally, through such techniques as:

Using loaded words

Terms like admitted instead of said or stated (admitted makes the speaker sound reluctant, as if he or she would prefer to hide something), or alarming and dramatic when a statistical increase or decrease is fairly small. I don't admit that I am a feminist; I may proclaim, announce, or state it.

Using discriminatory language

STC members will be only too aware of this practice.

Using sentence structure to convey subtleties of meaning

Here are two statements that could be made about a co-worker:

"Jean's work is slow, painstaking, and meticulous."

"Jean's work is meticulous, painstaking, and slow."

The first sentence leaves one with the impression that Jean might not be the speediest worker, but her results are excellent. The second sentence suggests that although Jean's results are very good, she takes far too long to achieve them.

I'm sure you can think of similar statements that could be made on a variety of topics.

Sensationalizing

This is related to the use of loaded words, mentioned above. To get the reader's attention, one often feels the need to find something dramatic or sensational to say. When is this ethical, and when isn't it?

Using logical fallacies

Presenting something as proof when it is only evidence is one very common logical fallacy. Sometimes it's caused by a lack of understanding, but other times it is deliberate. Other logical fallacies include taking things out of context and jumping to conclusions; there are many more.

How much detail is appropriate?

A common question in scientific and technical writing is how much detail to include. In many cases lack of space requires omission of detail. In other cases, the writer makes a judgment that the reader doesn't need to know the detail, or that the detail is more than the audience will understand. Sometimes these judgments can be used to hide information that you don't want to disseminate, but in most cases the writer isn't trying to hide anything. The writer's job is often to explain a complex topic at an appropriate level for the audience.

For example, users of a word processing program do not need to know, and in most cases do not want to know, how the computer or the program works. The users simply want to know how to use the program to accomplish their tasks. So the writer chooses only that subset of information that contributes to this goal, and includes a reference to a technical manual or other source for those users who do want to know the computing details. The choices are more complex when summarizing results of scientific papers or studies in the press, or in Environmental Impact Statements, and so on, because of some of the problems noted earlier under Rhetoric.

Choosing between advocacy and objectivity

Some scientific and technical communication is clearly advocacy. A public health campaign in Australia that urges people to "slip, slop, slap" (slip on a shirt, slop on sunscreen, and slap on a hat, to avoid developing skin cancer) simplifies the issues by cutting out most of the detail, and it certainly is advocating a course of action. A discussion paper on the causes and prevention of skin cancer should cite differing opinions and give references.

A major criticism of much of the information on smoking is that it's not "objective"; it clearly takes a pro- or anti-smoking stance. Whether that is good, bad, or indifferent is a matter of opinion, but it is certainly an ethical issue (without, in my view, a clear-cut answer).

Again, you can surely think of numerous other examples.

No easy answers

My purpose in this article has not been to provide any definite answers, because ethical situations are complex and nuanced, and each requires careful thought. Instead, my purpose has been to summarize some of the more important considerations you should keep in mind when you begin that thought process. A good way to practice these skills would be to use this article to help you work through one of the ethics case studies presented in Intercom. But don't stop there: once you've acquired these skills, see how you can use them in your job!

Jean Weber (jean@jeanweber.com) is a technical writer, editor, and publisher who lives in Australia. For many years she worked for government entities and businesses. Today she is actively involved in producing documentation for OpenOffice.org, an open-source alternative to Microsoft Office. Jean is the author of Is the Help Helpful? How to Create Online Help That Meets Your Users' Needs (published by Hentzenwerke, 2004), which won an award of Excellence in the STC's International Technical Publications Competition, 2006. Her Web sites include The Technical Editors' Eyrie, <http://jeanweber.com>.


A new world record

by Edgar L. Andreas (eandreas@nwra.com)

The paper that Shen et al. (2006) recently published in Geophysical Research Letters has just been included in the Budweiser World Book of Records as the best example for the misuse of acronyms and abbreviations ever to appear in the annals of science. In announcing the award, World Book spokesperson Gobb L. de Gook commented that, “We could not let this article fade into posterity unheralded. The Shen et al. paper is nothing short of reader abuse.”

In a paper of only five pages, Shen et al. introduced the following 28 acronyms and abbreviations: AIR, AOML, CCM3, CP, DE, fvGCM, GCM, GFS, GFDL, IC, MM, MPI, MSLP, MSW, NASA, NCAR, NHC, OBS, PP, RH, RI, RMW, SST, TC, T384, WRF, UTC, and ZETAC. De Gook elaborated that, “The two-letter abbreviations, especially, caught the attention of World Book scouts because they are clear evidence of lazy authors.”

R.A. Day (1998, p. 222) advises that, “You would be wise to keep abbreviations to a minimum. The editor will look more kindly on your paper, and readers of your paper will bless you forever.” He further concedes, however, that some recognizable acronyms and abbreviations are acceptable. Therefore, to authenticate the merit of the Shen et al. paper for recognition, World Book staff showed the above list of acronyms and abbreviations to 50 scientists whose demographics mirrored the readers of Geophysical Research Letters and asked them to identify as many as they could. One respondent actually got 19 correct, but he turned out to be a junior author of the Shen et al. paper. Among uninvolved respondents, the average number of correctly identified acronyms and abbreviations from the above list was 11.

To conclude, de Gook pointed out that Shen et al. cannot take full credit for their World Book recognition. They were helped in getting their manuscript into print by the technical editor and the reviewers for Geophysical Research Letters, who judged their science to be good but turned a jaded eye from the low quality of their presentation. The copyeditors for the American Geophysical Union also contributed notably to this world record by denying their professional training for the sake of expedience and to keep papers short.

References

Day R.A. (1998), How to write and publish a scientific paper. 5th ed. Oryx Press, Phoenix, AZ. 275 p.

Shen B.-W., Atlas R., Reale O., Lin S.-J., Chern J.-D., Chang J., Henze C., and Li J.-L. (2006), Hurricane forecasts with a global mesoscale-resolving model: Preliminary results with Hurricane Katrina (2005). Geophys. Res. Lett. 33:L13813, doi: 10.1029/2006GL026143.

Edgar L. Andreas (eandreas@nwra.com) is a senior member of STC. He has a Ph.D. in physical oceanography and M.S. and B.A. degrees in physics. He has been doing research on turbulent exchange processes over land, oceanic, and sea ice surfaces for over 30 years; this research has taken him inside the Antarctic Circle twice and inside the Arctic Circle four times. He is the lead author for 67 papers published in refereed scientific journals. Some of these articles have won STC publication awards at both chapter competitions and at the International Technical Publications Competition.


Book review: Writing in the health professions

Heifferon, B.A. 2005. Writing in the health professions. Pearson Longman, New York, NY. [ISBN 0-321-10527-3. 336 p., including index. US$53.33 (softcover).]

by David Armbruster (darmbruster@utmem.edu)

Previously published in Technical Communication 53(2):256-257 (May 2006).

On a recent flight, my seatmate, a young engineer, noticed what I was reading and asked, “Why is writing in the health professions different from writing in any other profession? Why does the topic require a separate text?” Drawing on Barbara Heifferon’s long experience in both the health care and technical communication communities, Writing in the Health Professions provides a credible answer: The primary reason for such a text is the variety of unique documents, audience mix, and legal and ethical issues found in the health professions. At the same time, she convincingly explains, all the theories and techniques, formats and styles, and guidelines and studies regarding technical writing are germane to writing in the health professions.

This book is well worth reading by those communicating in most aspects of the health professions. The one major exception is those writing peer-reviewed research articles. Several books deal with these writers and situations, and Heifferon has chosen, wisely, to focus on other kinds of writing.

The book is an excellent addition to the Allyn & Bacon Series in Technical Communication, edited by Sam Dragga. Both he and the author are to be commended on this contribution to the literature.

With only 11 chapters, Heifferon’s book is easily covered in a one-semester course. Her writing style is conversational, yet professional. She knows the technical writing literature and cites appropriate specialists throughout the book—much to her credit. Because there is not a logical order to health profession documents, it is easy to skip chapters that don’t pertain to your particular situation or interest. The chapter on multicultural and international medical writing raises many issues (for example, learning a second language), some of which are only minimally discussed. Yet complete answers are difficult to come by, so many additional resources are provided. The chapter “Presenting written materials visually” is especially informative and instructive for those new to medical and scientific communication.

The introduction to each chapter provides questions that are answered in the chapter and then summarized in the excellent chapter conclusions. Heifferon uses numerous examples from her classes and medical and communication experiences to bring currency and reality to her points. At the end of each chapter, she provides thought-provoking and varied discussion questions and exercises, which can be answered individually or by groups.

An outstanding feature of this book is that Heifferon deals with professional, medical, and publication ethics in Chapter 2. She doesn’t put this critical topic at the end of the book where, if it is used as a text, the topic is easily passed over at the end of the semester. Ethics is up front, important, and emphasized throughout the book.

The book is appropriate for an advanced undergraduate or graduate technical communication course. But I also believe that it could be used by health science paraprofessionals and professionals who have communications responsibilities but who don’t have a technical communication background. Certainly, Heifferon (and the series editor) assume some understanding of technical communication theory and practice on the reader’s part. Nonetheless, Heifferon provides enough information and references that someone outside the technical communication field can, in fact, learn to communicate productively in the health professions.

Problems are minor, and some are probably proofreading errors. For example, she cites the Uniform Requirements developed in 1997, but the reference is 1992. (They were actually promulgated in the late 1970s, so perhaps 1997 should have been 1979.) A complete list of references at the end of the book, in addition to the lists at the end of each chapter, would be helpful. Explanatory notes at the end of each chapter are a bother; footnotes would avoid forcing readers to flip back and forth.

Figures occasionally appear before their text callout, which I found distracting. Additionally, the placement of tables and figures is not always intuitive and is occasionally wrong, as with “in the map above”, which is actually below. Because of the plethora of topics covered, none can be treated exhaustively, but many excellent references are provided for more in-depth study. Again, the author knows and cites the experts in the specialized areas of document design, cultural issues, graphical presentation of data, etc.

The book contains very few typos (a significant achievement for any press today, in my opinion) and reflects great attention to detail by all concerned, unlike some earlier books in this series.

The only major omission is a chapter on pharmaceutical industry publications. Although documents in this area can be arcane and daunting, they are a large part of writing in the health professions, and they deserve more than the single page in Chapter 8 that describes documents prepared in pharmaceutical companies. Heifferon provides a good reference there, but none to the American Medical Writers Association, many of whose members write for pharmaceutical companies.

Finally, Writing in the Health Professions is a real contribution to a rapidly growing field.

David L. Armbruster (darmbruster@utmem.edu) is head of scientific publications at the University of Tennessee Health Science Center, where he has taught for more than 20 years. He is a Fellow and past president of STC.


Book review: Wordless diagrams

Holmes, N. 2005. Wordless diagrams. Bloomsbury Publishing, New York, NY. [ISBN 1-58234-522-8. 160 p. US$14.95.]

by Charles Crawley (crcrawle@rockwellcollins.com)

Previously published in February 2006, Technical Communication 53(1):101.

Nigel Holmes is not a name that will ring a bell unless you are a middle-aged technical communicator. But he is someone you want to know, whether you are experienced or new to the profession. Holmes was graphics director for Time magazine in the 1980s and was responsible for many of the information graphics that we now take for granted. For example, many of the charts and diagrams that you see in USA Today were pioneered by Holmes. Since that time, he has moved into the private realm and runs his own firm, Explanation Graphics Company (http://www.nigelholmes.com).

A disciple of Gerd Arntz, Otto Neurath, and Rudolf Modley, Holmes wrote a trio of books on graphic design: Designer’s Guide to Creating Charts & Diagrams (Watson-Guptill Publications, 1984), Designing Pictorial Symbols (Watson-Guptill Publications, 1985), and Pictorial Maps (Watson-Guptill Publications, 1991). But that last book appeared 15 years ago. So it was with great joy that I picked up his little book Wordless Diagrams and remembered how much I had learned from him.

Technical communicators who create or re-create graphics will benefit from this completely wordless book (wordless, that is, except for the front matter). Holmes has chosen a wide array of practical, humorous, and interesting concepts to illustrate. For example, “How to make a grilled cheese sandwich” (30-31) shows seven frames, with directions including materials, movements, sequence, and time. As for humor, he devotes one page to “How to drive while using a cell phone” (71), with a person driving a car talking on a cell phone—with a big red prohibition symbol around it! And in terms of interesting concepts, “How to cremate a body” (140-141) will teach you something that David Macaulay didn’t in The Way Things Work (Simon & Schuster, 1973). This sequence shows the temperature of the crematorium and the use of a magnet to remove metallic objects.

Another group of readers that may benefit from this book is teachers. As an exercise for a college-level professional writing class, I showed the class several how-tos, including “How to fold a t-shirt” (42-43) and “How to change a disposable diaper and a cloth diaper” (152-153). Then I had the students write instructions to accompany the diagrams. The students had fun and learned something about writing directions.

This book has 158 series of wordless diagrams and moves in the direction many companies have already adopted; Hewlett-Packard, for example, now provides wordless diagrams for changing laser toner cartridges. While writers needn’t abandon their pens for brushes (or Word for Photoshop, although that might be nice), we can learn much about good communication from the esteemed information designer Nigel Holmes.

Charles R. Crawley (crcrawle@rockwellcollins.com) is a lead technical writer for Rockwell Collins in Cedar Rapids, IA, and served two terms as Eastern Iowa chapter president. He has reviewed books for Technical Communication for over 10 years.


Parting thoughts

"Nothing comes harder than original thought. Even the most gifted scientist spends only a tiny fraction of his waking hours doing it... The rest of the time his mind hugs the coast of the known, reworking old information, adding lesser data, giving reluctant attention to the ideas of others... warming lazily to the memory of successful experiments and looking for a problem—always looking for a problem, something that can be accomplished, that will lead somewhere, anywhere. There is, in addition, an optimal degree of novelty in problem seeking, one that is difficult to measure and follow. Stick to the coast too tightly and only minor new data will follow. Venture out of sight and you risk getting lost at sea. Years of effort might then be wasted, competitors will hint that the enterprise is pseudoscience, grants and other patronage will be cut off, and tenure and election to the academies denied. The fate of the overly daring is to sail off the rim of the world."—Edward O. Wilson, The Drive to Discover

"In their aspect and individuality, our ideas are signs, not portions, of what exists beyond us; and it is only when experiment and calculation suceed in penetrating beneath the image, that (for instance, in mathematical physics) we may gain some more precise, although still symbolic, notion of the forces that surround us. We and our knowledge are a part of nature: it is therefore inevitable that the rest of nature, in its concreteness, should be external to us... Absolute truth is hidden from us, and the deeper our science goes, the more ghostly it becomes."—George Santayana, Spirit in the Sanctuary

"Science quite appropriately acknowledges that error should be assumed, and at best it proceeds by a continous process of criticism meant to isolate and identify error... [but] as the history of eugenics proves, science at the highest level is no reliable corrective to the influence of cultural prejudice but is in fact profoundly vulnerable to it."—Marilynne Robinson, Hysterical Scientism: the Ecstasy of Richard Dawkins

"It is diversity that makes any natural system robust, and diversity that stablizes culture against the eccentricity and arrogance that have so often called themselves reason and science."—Marilynne Robinson, Hysterical Scientism: the Ecstasy of Richard Dawkins

"More than any other time in history, mankind faces a crossroads. One path leads to despair and utter hopelessness. The other, to total extinction. Let us pray we have the wisdom to choose correctly."—Woody Allen, author actor, and filmmaker (1935-)

"... when reason detaches itself from its material roots and grows hubristic, falling prey to a belief in its own autonomy, it becomes simply another form of irrationalism."—Terry Eagleton, The Enlightenment is Dead! Long Live the Rnlightenment!

"In science, it is less often deception than it is self-deception we must guard against; but in science and beyond, the best stance is to seek that clear ring [of truth]. Once sought, once heard, we ourselves have changed."—Philip and Phylis Morrison, The Ring of Truth

“That is the essence of science: Ask an impertinent question, and you are on the way to a pertinent answer.”—Jacob Bronowski

“Living in an age dominated by science, we have come more and more to believe in an objective, empirical reality and in the goal of reaching a complete understanding of that reality. ... But we’re fooling ourselves. Most of these comprehensive theories are no more than stories that fail to take into account one crucial factor: we are creating them.”—Robert Lanza, A New Theory of the Universe

“The scientist, for [botanist Paul B.] Sears, is as susceptible as anyone else to the deforming impact of prejudice or expectation, and whenever ‘his feelings enter into his observations or his calculations, the truth he is after is imperiled’.”—Richard Nicholls, The Scientist’s Fresh Eye

“The most important scientific revolutions all include, as their only common feature, the dethronement of human arrogance from one pedestal after another of previous convictions about our centrality in the cosmos.”—Stephen Jay Gould, paleontologist, biologist, author (1941-2002)

“The conscience of the world is so guilty that it always assumes that people who investigate heresies must be heretics; just as if a doctor who studies leprosy must be a leper. Indeed, it is only recently that science has been allowed to study anything without reproach.”—Aleister Crowley, author (1875-1947)


Contact and copyright information

The Exchange is published on behalf of the Scientific Communication community of the Society for Technical Communication. Material in the Exchange can be reprinted without permission if credit is given to the author and a copy of the reprint is sent to the editor. Please send comments, letters, and articles to the editor.

Editor and publisher:

Geoff Hart (ghart@videotron.ca)

Community manager:

Kathie Gorski (kgorski@execpc.com)

Webmaster:

C Joel Koeppen (cjoelk@earthlink.net)

© 2007, Society for Technical Communication (901 North Stuart St., Suite 904, Arlington, Virginia 22203-1822 U.S.A., 703-522-4114, 703-522-2075 fax, www.stc.org)