I like Roger’s observations that “exceptions are the rule in production”, exceptions require decisions, and “decision making is something humans excel at”. This reminds me of Jon Udell’s old motto that “human beings are the exception handlers for all automated workflows”. Software that doesn’t embrace this fact will stop the work from flowing.

“Workflow systems are rigid and don’t reflect the constantly changing realities of most businesses” – I can absolutely confirm this. So many times have we written code to automate a process, making the customer happy, until new business opportunities required changes and our code kept them from adapting quickly. Some larger customers make sure to have in-house DAM technicians who can code and configure without always having to go through us, the vendor.

Roger argues that DAM software should support power users, not enforce rigid workflow definitions. This is quite interesting: It obviously appeals to me as a developer (i.e., power user), but quite a few of our customers seem rather scared of power users. They’d rather have a “power administrator” and not leave much freedom to the regular user.

To me, another important point in Roger’s article is this: “Valuable integration begins with the user experience, with the frontend.” David Diamond demanded very much the same in Reinventing Digital Asset Management. I have written about frontend Web app interoperability before; it’s hard, but this is the problem we have to solve.

The rest of this post is a scenario that helps me think about what workflow support might mean in practice. It’s a bit lengthy, so feel free to stop reading here :-)

An example DAM workflow

This workflow is common to our newspaper-publishing customers:

A newspaper editor asks a photographer to take pictures of an event. The photographer sends the pictures to the newspaper, and one of those gets printed in the paper.

Workflows consist of processes. Let’s look at the processes involved: First, there’s a planning phase where editors decide which topics and events to cover. One of the available photographers needs to be assigned the task of taking the pictures. The photographer, after shooting, selects the best photos and adds descriptive text and metadata to them. Then she sends the photos to the newspaper. At the newspaper, the favorite picture is chosen, the image cropped and enhanced, placed on the page and sent to the printer. Someone in accounting makes sure the photographer gets paid. And finally, the pictures – with more metadata for better findability – are added to the newspaper’s image archives to allow reuse.

Note that there’s no mention of software so far: This workflow is decades old and doesn’t require any software at all, let alone “workflow engines”. Now let’s see how software can help.

Level 0: Do everything manually

Today, almost every task mentioned above involves software. But often, these tasks are performed in separate systems that don’t talk to each other: Editorial topic planning might happen in Trello, while photographer assignments are tracked in a Google calendar. Photos are sent to the newspaper via e-mail, then manually uploaded into a DAM system. For image retouching, the image is downloaded from the DAM, the retouched version re-uploaded, then manually exported to the editorial system where it gets placed on a newspaper page. The next day, a librarian searches the DAM for each picture that appears in the printed paper and manually adds metadata including the date of publication. Based on that metadata, someone else can search for published images and enter payment data into the accounting system so the photographer gets paid at the end of the month.

In this scenario, there are many isolated software systems. Humans need to know where to look for information, and which data to manually copy between systems.

Level 1: Automation

Pretty soon, people will want to automate parts of these processes: The photographer assignment notification should contain a DAM upload link so that images go directly to the DAM, with automatically added assignment metadata. The editorial system should tell the DAM which images have been printed in the newspaper, so that the DAM can add publication metadata automatically, move the image into the long-term image archive, and send payment data to the accounting system.
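To make the second automation concrete, here’s a minimal sketch of what a “this image was printed” handler might look like. All names (Asset, handle_publication, the metadata keys) are my own invention for illustration, not an existing DAM API:

```python
from dataclasses import dataclass, field

# Hypothetical asset record; a real DAM would have its own model.
@dataclass
class Asset:
    asset_id: str
    metadata: dict = field(default_factory=dict)
    archived: bool = False

def handle_publication(asset: Asset, publication_date: str, page: int) -> dict:
    """React to an 'image was printed' event from the editorial system."""
    # 1. Add publication metadata automatically.
    asset.metadata["published_on"] = publication_date
    asset.metadata["page"] = page
    # 2. Move the image into the long-term archive.
    asset.archived = True
    # 3. Produce a payment record for the accounting system.
    return {"asset": asset.asset_id, "reason": f"published {publication_date}"}

photo = Asset("img-4711", {"photographer": "jane"})
payment = handle_publication(photo, "2015-06-01", 12)
```

Three systems are touched by one event, which is exactly where the time savings come from.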

A lot of time can be saved automating processes. But of course there are drawbacks, too: Automation implies the assumption that we always want the same things to happen. It means taking human oversight and decisions out of the loop.

What can go wrong? The photographer might send pictures not related to the current assignment, so the automatically attached metadata is wrong. Not every published image may be copied into the archives for reuse. And humans would know that they can skip the logo that’s in the paper every single day, even though it technically is a published picture.
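Such exceptions tend to creep into automation code as hardcoded special cases. A tiny sketch of how that looks in practice (should_archive, SKIP_FILENAMES and the dict keys are hypothetical names of my own):

```python
# The "skip the daily logo" exception, hardcoded. Lists like this
# tend to grow as more exceptions surface in production.
SKIP_FILENAMES = {"masthead-logo.png"}

def should_archive(image: dict) -> bool:
    """Decide whether a published image goes into the archive."""
    if image["filename"] in SKIP_FILENAMES:
        return False  # humans used to know this; now it's code
    if not image.get("reuse_allowed", True):
        return False  # rights exception: not available for reuse
    return True
```

Every such rule is a human decision frozen into software, which is precisely why it goes stale when the business changes.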

This kind of automation is usually neither visible to end users, nor can they stop it from happening. Changing automated processes – to better align them with always-changing business processes – involves technicians, whether processes are “hardcoded” in software or configurable.

And of course, no software works perfectly all of the time. When an automated process fails, you have a whole new class of problems: possible data loss, follow-up processes that already have run with incomplete input data, the difficulty of manually working around the problem while it persists, and cleaning up afterwards.
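One common mitigation, in the spirit of “humans are the exception handlers”: have each automated step record its failures somewhere a person can see and decide on them, instead of burying them in a log. A minimal sketch with illustrative names, not a real workflow engine:

```python
# "Inbox" that a human exception handler would review.
failed_tasks = []

def run_step(step, asset: dict) -> bool:
    """Run one automated step; on failure, park it for human follow-up."""
    try:
        step(asset)
        return True
    except Exception as exc:
        failed_tasks.append(
            {"asset": asset["id"], "step": step.__name__, "error": str(exc)}
        )
        return False

def archive(asset: dict):
    raise RuntimeError("archive system unreachable")  # simulated outage

ok = run_step(archive, {"id": "img-4711"})
```

This doesn’t solve the follow-up-process and cleanup problems, but at least the failure becomes visible and actionable.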

Level 2: Workflow awareness

Even with automation, the software has no notion of the overarching workflow. It isn’t aware of the context, so it cannot present the context to users. Human communication revolving around the workflow needs to happen “out of band”, i.e. via phone and e-mail, outside of the systems the information is living in.

Which information can get lost in our example workflow? Well, the photographer might want to communicate something to the Photoshop guy (“make sure to blacken out the license plate”), or to the accounting staff (“the editor agreed on paying twice the standard fee for this assignment”). She might have to alert the paper that these pictures must not be reused, an exception to the rule that all published pictures are marked as “available for reuse”. To make this last one more difficult, let’s assume that she becomes aware of this after she has sent the pictures to the newspaper, but before the automated archiving process has run (so she cannot add this information to the image metadata, and the archivist she might call doesn’t yet see the image in the archive).

Each of the persons (and automated processes) involved may have information to add, or questions to ask, or decisions to make that affect other (possibly automated) processes.

A fully workflow-aware DAM system would treat a workflow instance as an asset-related entity with its own metadata. Each asset used within a workflow would display its own routing sheet that shows what kind of workflow this is, which additional information has been attached to it, what happened so far and what’s to happen next (manually or automatically). With appropriate permissions, the user could modify that sheet to add information, change what happens next, or move the asset out of this workflow instance.

The routing sheet with its workflow and process data would probably live in a separate system because real-life workflows cross system boundaries. And the same sheet would appear in all systems involved in the workflow; the Photoshop guy would see it in Photoshop, the photographer in her photo upload app, the accountant in SAP.
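As a thought experiment, the routing sheet could be modeled roughly like this. WorkflowInstance and all its fields and methods are my own sketch, not an existing system:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class WorkflowInstance:
    workflow_type: str                                   # e.g. "photo assignment"
    asset_ids: List[str] = field(default_factory=list)
    history: List[str] = field(default_factory=list)     # what happened so far
    next_steps: List[str] = field(default_factory=list)  # manual or automated
    notes: List[str] = field(default_factory=list)       # human annotations

    def add_note(self, author: str, text: str) -> None:
        """Attach out-of-band information in-band, visible to everyone."""
        self.notes.append(f"{author}: {text}")

    def complete_step(self) -> str:
        """Move the next pending step into the history."""
        step = self.next_steps.pop(0)
        self.history.append(step)
        return step

sheet = WorkflowInstance(
    "photo assignment",
    asset_ids=["img-4711"],
    next_steps=["retouch", "place on page", "archive"],
)
sheet.add_note("photographer", "make sure to blacken out the license plate")
sheet.complete_step()  # "retouch" moves from next_steps into history
```

The interesting part isn’t the data structure, of course, but that every system involved would read and write the same instance of it.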

Can we do this? Does something like this already exist? Or would it be overkill and we should indeed refrain from workflow features and just let the power users define their own automation?