Perfecting the Review Process: Chefs, Tech Writers, and Questions from the Test Kitchen

Many years ago, I heard my colleague (and founder of Excosoft) Jan Christian Herlitz say that reusable content is like food. Each reusable component is an ingredient in the kitchen, and with a recipe we can combine ingredients into a dish—a complete document.

Throw bacon together with sausages and eggs and you have an English breakfast. Or wrap it around a bundle of asparagus for a completely different culinary experience. Same bacon, different dish. Just like reusing a safety procedure in a technical manual. Actually, why not slice up the bacon and use it in place of basil in this Caprese salad to make things a bit more interesting? And then Gordon Ramsay, ever on the lookout for chefs who have lost their footing and indulged in their favorite unsavory food fantasies, knocks me on the head with a spoon and throws the salad in the bin. We've all seen Kitchen Nightmares, right?

This illustrates the need for a review process. In this case, it takes the form of a head chef who makes sure that each dish does the job it is meant to do. Even when every ingredient is in impeccable shape and cooked to perfection, a dish can still fail.

The same goes for content in Skribenta. Technical writers are like chefs who keep their metaphorical kitchens in spotless order, produce reusable content components, then combine them to form complete documents ready for delivery to their audiences. At their disposal, they have several advanced tools for reusing and customizing content, such as variables, conditions, and integrations with product information databases. These components are their domain, and they use their specialized skills to maintain them. This is the equivalent of preparing an ingredient so that it works in different kinds of dishes (boiled or fried potatoes, say).

Often, someone else is responsible for approving the documents. Perhaps it is a product owner, manager or subject matter expert. They should not be forced to learn all the details of how the documents were produced and what results it will generate. What they need to review is the complete document, not the reusable components.

So, we built an HTML representation of the document and invited them to review it there. That way the comments are kept in the production system and perhaps we can even feed the comments back to the source component that contains the commented text, so the technical writers have easy access to them. Piece of cake.

Except for one thing: thanks to reuse and content customization, technical writers can produce vast amounts of documentation with little effort. After all, that is one of the reasons people use Skribenta in the first place. How can the reviewers keep up with that pace, if they must review the entire document every time? In many cases, most, or even all, of the content will have already been reviewed inside another document. It seems like they get the short end of the stick.

To help the reviewers, we should make it possible to reuse the results of reviews. And that reuse must happen in segments of the document, rather than the whole. For instance, if an instruction in a document has already been approved in a previous edition of the document, or in another document entirely, and it is still identical to the approved instruction, the reviewer would not have to spend time reviewing it again.

The reason I’m writing about all of this is that we are currently working on the review process in Skribenta. More specifically, we are building a review app that will allow the entire document to be reviewed, while also enabling the review results to be reused. Of course, this forces us to think about how all of this is going to work out, and what it really means to reuse review results. We do not have all the answers yet, and we’d love to hear insights from others in the field. Here are some of the questions we’ve been asking ourselves:

First, what is the thing that should be reused? In traditional word processing tools, review functionality is often centered around reviewer comments. But we currently believe that for reusability, another concept is needed: content approval.

A comment usually means that something is amiss. Content must be changed and reviewed again. It can certainly be relevant to see that a certain piece of reusable content has been commented on in another document, but it does not help reviewers avoid reviewing the same unchanged content over and over again in different documents. Content approval should be explicit, with the meaning that the content is correct and fulfills the requirements in the current context.
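To make the distinction concrete, here is a minimal sketch of the two concepts as data records. The class and field names are our own illustration, not Skribenta's actual data model: a comment points at content that must change, while an approval is an explicit, reusable statement that a specific revision of the content is correct in a specific context.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Comment:
    """Signals that something is amiss: the content must be changed
    and reviewed again. Not reusable as a review result."""
    content_id: str        # which piece of content is commented on
    author: str
    text: str
    created: datetime

@dataclass
class Approval:
    """An explicit statement that the content is correct and fulfills
    the requirements in a given context. This is the reusable record."""
    content_id: str        # identifies the exact revision that was approved
    document_id: str       # the context in which the approval was made
    reviewer: str
    approved_at: datetime
```

Note that the approval carries its context (`document_id`) with it; as discussed below, that context is exactly what a later reviewer needs in order to judge whether the approval still applies.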

Second, how much content can be approved at once? How small or large should the content chunks be? Since the content model in Skribenta (like that of other, similar XML-based content production solutions) is organized in a hierarchical tree structure, we could allow approval of any node in that tree. For instance:

document
  section
    title
    list
      listitem
        paragraph
          #text
      listitem
        paragraph
          #text

Surely, it is not reasonable to approve anything smaller than a paragraph. In fact, that is where we are planning to draw the line. That means it would be possible to approve things like the whole document, a chapter, a table, a list, or a paragraph. Make sense?

A related question: if an entire chapter is approved, should its content be automatically approved when it appears identically in another document? In other words, can the approval of a chapter be "inherited" by the paragraphs within it, so that they appear as pre-approved in another document?
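One way such inheritance could work is sketched below. This is our own illustration, not how Skribenta is actually implemented: each node's content is identified by a hash over its entire subtree, so "identical content" means "identical hash"; approving a node also records approval for every approvable descendant, and nothing smaller than a paragraph can carry an approval of its own.

```python
import hashlib
from dataclasses import dataclass, field

# Node kinds that may carry an approval; nothing smaller than a paragraph.
APPROVABLE = {"document", "chapter", "section", "table", "list", "listitem", "paragraph"}

@dataclass
class Node:
    kind: str                                   # "document", "section", "paragraph", "#text", ...
    text: str = ""                              # only "#text" leaves carry text
    children: list = field(default_factory=list)

    def content_hash(self) -> str:
        # Hash the kind, the text, and all descendants, so identical
        # content produces the same hash wherever it is reused.
        h = hashlib.sha256(self.kind.encode())
        h.update(self.text.encode())
        for child in self.children:
            h.update(child.content_hash().encode())
        return h.hexdigest()

def approve(node: Node, approvals: set) -> None:
    """Record an approval for a node and, by inheritance, for every
    approvable descendant, so those parts can appear as pre-approved
    when they show up unchanged in another document."""
    if node.kind in APPROVABLE:
        approvals.add(node.content_hash())
    for child in node.children:
        approve(child, approvals)

def is_pre_approved(node: Node, approvals: set) -> bool:
    """True if identical content was already approved, in this document
    or in another one."""
    return node.content_hash() in approvals
```

In this sketch, approving a chapter marks its paragraphs as approved too, and any later document containing a byte-identical paragraph would find it pre-approved, while even a one-word change produces a new hash and forces a fresh review.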

The most difficult question though, which may have already occurred to the reader, is: how is the approval affected when the approved content is reused in another context? How far can we assume that a piece of content that is correct in one context, is also correct in another? The short answer is, "it depends." This would seem to be something that, as tool vendors, we cannot decide. It requires some understanding of the content and the different contexts it appears in—and that will be different for each case.

So, we need to reuse approvals, but we cannot come up with a rule that determines when reuse is safe and when it would risk reducing quality. A bad rule would trick the reviewer into believing something is approved when, in fact, that approval does not apply to the reviewed document. A detailed assembly instruction can be perfect for one product, but for another it might describe how to assemble parts that do not even exist.

The best answer to this conundrum we have come up with is to leave it up to the users. When a reviewer comes up against some content that has been approved somewhere else, that previous approval should be clearly indicated, together with relevant information about when that approval took place, who made it, and in what context it was made. That way, the reviewer can decide whether it is still good enough or the content needs to be reviewed again. Perhaps there could be a filter for the entire review job, where it would be possible to select which previous reviews should be displayed and which should be discarded right from the outset.
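Such a filter might look like the following sketch. The function and parameter names are hypothetical, not an actual Skribenta API: the reviewer chooses which previous approvals to trust, for instance only those made by certain reviewers, in a certain document, or recently enough.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Approval:
    """A minimal previous-approval record, shown to the reviewer
    together with who made it, when, and in what context."""
    content_id: str
    document_id: str
    reviewer: str
    approved_at: datetime

def relevant_approvals(approvals, *, document_id=None, reviewers=None,
                       max_age=None, now=None):
    """Keep only the previous approvals the reviewer has chosen to
    trust for this review job; everything else is discarded up front."""
    now = now or datetime.now()
    kept = []
    for a in approvals:
        if document_id is not None and a.document_id != document_id:
            continue                       # wrong context
        if reviewers is not None and a.reviewer not in reviewers:
            continue                       # untrusted reviewer
        if max_age is not None and now - a.approved_at > max_age:
            continue                       # too old
        kept.append(a)
    return kept
```

The point of the design is that the tool never decides on its own that an old approval applies; it only surfaces the candidates that pass the reviewer's chosen filter, and the reviewer makes the final call.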

This is still a work in progress, and we are still considering what will and will not work. As mentioned, we would really appreciate feedback on these ideas. So if you know how reusable content can be reviewed efficiently, or have any other reflections to share, do not hesitate to write to us in the comments. Or, if you know a good recipe for Caprese salad with bacon… we would like to hear about that too.

About the author

Joakim Ström

With over 15 years dedicated to software development, Joakim expertly drives internal improvements and often hands-on innovation here at Excosoft.


Comments

Jennifer · 11 months ago

We have more than one review process. One review comes from the SMEs to make sure the document content is accurate for the product. Then we have an editor (me) review the document for grammar, organization, clarity and adherence to company style guidelines. We would be interested in being able to flag content as approved by different types of reviewers.