To a question about tracking documentation errors and bug
fixes in the software being documented, for inclusion in
upcoming releases of the docs, Janet Valade ventured:

> use the same process the developers use to track product
> bugs. Usually, it is some kind of database in which bugs
> are logged and tracked. If you monitor the bugs database,
> you will know the identified bugs and will know when/if
> bugs are fixed. You can use the same database to log
> "bugs" in your docs, and track the versions of the manual
> in which the "bugs" were fixed. This is helpful for
> technical support, as well.
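(A minimal sketch of what Janet describes, assuming a tracker
that exposes records as plain dictionaries; the field names
here -- "component", "fixed_in", and so on -- are made up for
illustration, not from any particular product:)

```python
# Sketch of logging documentation "bugs" in the same database the
# developers use for product bugs. All field names are hypothetical.

def log_doc_bug(db, summary, manual, found_in):
    """Add a doc-bug record alongside the product-bug records."""
    record = {
        "component": "docs",   # distinguishes doc bugs from product bugs
        "summary": summary,
        "manual": manual,      # which manual the error is in
        "found_in": found_in,  # manual version containing the error
        "fixed_in": None,      # set when a corrected manual ships
        "status": "open",
    }
    db.append(record)
    return record

def doc_bugs_fixed_in(db, version):
    """For tech support: which doc bugs were fixed in a given release."""
    return [r for r in db
            if r["component"] == "docs" and r["fixed_in"] == version]
```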

Hmm. Here's my unpleasant experience.
For lack of a better way, the above is what I try to do,
but the nature of the beast gives me this problem:

The developers record *everything* in their database, so
there is no searchable/filterable distinction between
formal "bugs" in released software, and glitches,
suggestions, explorations, afterthoughts, etc. that go
along with the in-house development process. In other
words, many of the "bugs" are just growing pains that
will NEVER see the light of day in a release, but to
the developers working on them, they are just as serious
and are discussed in exactly the same terms. Both kinds,
the interim bugs and the "real" bugs, are mixed in no
particular order and with no particular identifying
marks. For that matter, I'm just as likely to find
myself reading about problems, suggestions and fixes
for our in-house tools as about our products. They are
written and used by all the same engineers.

The result is a HUGE amount of reading, especially when
several software/firmware projects are in the works at
one time. My mailbox gets swamped with all the updates
that are generated each time somebody makes an entry in
the database. Why, it gets so bad sometimes that I can
hardly see the TECHWR-L posts in all the clutter. :-)

To make matters worse, one group uses the option to tack
all new comments/updates onto the bottom of the database
record for a given problem (if you start reading from the
top, you get the history and context and eventually arrive
at the current comments at the bottom of the record),
while another group prefers to have their entries appear
in reverse chronological order (if you already are
familiar with the problem -- from working on it or
managing it -- you see the new stuff without needing to
scroll past the history).
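(One way to paper over the two conventions is to sort each
record's comments by timestamp before reading, so every record
reads the same way no matter which group filed it. A sketch,
assuming each comment carries a date field -- the record
structure is invented for illustration:)

```python
from datetime import date

def normalize_comments(record, newest_first=False):
    """Return a record's comments in one consistent order, whether
    the filing group appended new comments at the bottom (chronological)
    or at the top (reverse chronological)."""
    return sorted(record["comments"],
                  key=lambda c: c["date"],
                  reverse=newest_first)
```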

I've tried asking all developer groups to employ keywords
to flag the user-doc guy (me...), but that didn't work.
When I attempted to rely on the keywords for searching and
filtering, I missed some BIG stuff that should have been
documented.
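(For what it's worth, keyword filtering only catches what people
remember to tag, which is exactly how I got burned. A safer sketch
combines the doc keyword with a coarser net -- say, any record
filed against a shipping product -- so the big stuff can't slip by
untagged. The keyword, field names, and product names below are
all hypothetical:)

```python
DOC_KEYWORD = "userdoc"
# Hypothetical names for the products that actually ship to users,
# as opposed to in-house tools.
SHIPPING_PRODUCTS = {"WidgetPro", "WidgetLite"}

def needs_doc_review(record):
    """Flag a record for the doc writer: either someone tagged it for
    docs, or it touches a shipping product (the catch-all, so an
    untagged but significant fix still surfaces)."""
    tagged = DOC_KEYWORD in record.get("keywords", [])
    shipping = record.get("product") in SHIPPING_PRODUCTS
    return tagged or shipping
```

The catch-all trades reading volume for recall: more records to
skim, but nothing against a shipped product gets missed outright.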

It seems there must be a better way. By the way, until
we/they find a better system (and we're looking...) we
use Tower Concepts' Issue Weaver on RAZOR to track the
development and support issues within Engineering.

If somebody has some pre-packaged tips and tricks on a
web site somewhere, that would be "a good thing". How
do you folks mine the data that comes out of engineering
bug/issue tracking, without drowning in it, and without
missing anything significant to your end-user audience?

The single ray of sunshine is that I can track my OWN
documentation bugs by the simple expedient of using my
own private keywords in the Razor database. That keeps
me solidly on top of... oh.... say... about 2 percent
of the problem.