All kidding aside, this joke captures the true essence of design thinking. It sees that the original question contains an assumption that the light bulb is the problem — but the real issue is that the room is dark! What if we instead ask:

How can we light this dark place?

By taking a step back to reframe the original question in a larger context, we give ourselves the opportunity to solve the problem in a new way, to find methods that don’t rely on light bulbs, fixtures, or even electricity. It may turn out in the end there is no other better option — but unless we keep asking questions, we risk seriously limiting our chance to discover innovative solutions.

The process of innovating is what physicists call "sensitively dependent on initial conditions"; that is, it is shaped and defined by how the problem is framed at the outset. And just as the gentle breeze from a butterfly's wings can develop into a hurricane, the definition of the challenge at the start of a design thinking journey has a huge impact on the outcome.

There are many factors that affect the outcome of the creative process, but asking questions is probably most critical to the success of any exploration and venture. That’s why practicing design thinking is so valuable.

What is design thinking anyway?

Design thinking fuels innovation efforts in almost every industry around the world, but its definition can seem somewhat elusive.

Design thinking is a creative process that uses tools from the world of design — asking questions, observation, brainstorming, prototyping, and more — to gather inspiration, build empathy, find real needs, and create actionable solutions.

The light bulb joke we started with contains a fundamental truth about design thinking: it asks us to fall in love with the challenge before we start to answer it. It tries to ensure that we question the parameters, identify our assumptions, and pick the right problem to solve. And it continues to ask questions at every step we make towards a solution.

This can be uncomfortable for people used to the typical business method of spotting and solving problems rapidly — but it’s necessary. Innovation rarely happens from choosing the swiftest or most obvious solutions.

That said, design thinking complements but does not replace the valuable work done by marketers, business analysts, and trained design professionals. What it uniquely brings to the development process is a focus on — and sensitivity to — the personal experience of real people. How do they see my product or service? What images and mental models do my instructions conjure up? How might we design an experience for them that is useful, friendly, and delightful?

Inspiration for innovation comes by gaining understanding and empathy for people’s experiences, and we start by taking that mental step back to see the bigger picture.

What kind of work is design thinking good for?

Despite its name, design thinking isn’t just for designers in a studio.

Design thinkers in all roles and industries have worked on an endless array of challenges: the new-employee experience, hospital procedures, restaurant flow, city district redevelopment, school systems, water in the developing world, health insurance application, and of course new products, services, and digital experiences of all kinds.

Content creators — whether in marketing, gaming, social media, music, or other endeavors — who practice it in the development of their work can better understand their audience’s perspective. This empathy allows them to create content that speaks directly to a reader’s needs or desires, which can result in a boost of engagement, adoption, purchases, downloads, and so on.

Enlightened organizations also use the process's method of thinking and acting to transform the way they engage individuals and teams, and to change the way they deliver value to the world around them. That's because design thinking builds creative collaboration among participants by teaching them how to think differently, develop teamwork skills, and express their unique points of view in ways that amplify their potential to arrive at unexpected, unique solutions.

This mindset can be available to everyone in an organization, not just those with certain backgrounds. Collaborators at every level can practice design thinking if they are given the support and resources necessary to apply it. It starts by continually asking questions, keeping an open mind, and truly listening to people’s answers.

Great ideas can come from anyone. So, the next time you are confronted with a problem that has an immediate, obvious answer, make sure to pause for a minute and ask yourself: does it have to be a light bulb?

The image at the top is from Liter of Light, a global project of the MyShelter Foundation that cleverly repurposes empty soda bottles to provide solar light to communities with limited or no access to electricity: http://literoflight.org.

Chapter 1. What Is a Global Content Strategy?

Content. Our world revolves around content. These days, buying decisions are often based on experiences not with products, but with information about products. We have printed books, newspapers, and magazines. We have e-readers, smartphones, and tablets. We have TV, radio, YouTube, Instagram, Pinterest, and Hulu. People consume more content in more ways than ever before.

No one can dispute the increasingly important role that content plays in our lives, our work, and just about everything we do.

Naturally, with the growing importance of content, a lot of attention is being paid to content strategy. This is a good thing. Companies need to stop throwing content out to the world without a good reason. They need to manage content strategically to contain their expenses, control their brand, avoid confusion, improve search and findability, and more.

But what about global content? What about all the content that your company produces for people in other parts of the world? Content professionals who focus only on in-country strategy, and fail to think strategically about the millions of words, images, and media assets destined for other languages and locales, do so at their peril.

According to Common Sense Advisory, in 2014 the translation industry (which includes both tools and services) was worth US$37 billion and growing at over six percent per year. The firm predicts that by 2018 the language-services market will reach US$47 billion.

Companies that spend big bucks on translation need to spend time, energy, and money creating strategies to manage all that content.

Definition

Let’s start with a definition of global content strategy:

A global content strategy is a plan for managing content that is intended for people whose main language is something other than the source language.

Components of a global content strategy

A global content strategy can be broken into three parts.

Part I – Understanding where you are and where you want to be

What are your goals for managing global content?

How is your global content currently created and managed?

How do you currently create and translate content?

What content do you currently have?

Part II – Analyzing the gap

How far off is your current situation from your goals?

How far off is your current situation from industry best practices?

Part III – Moving ahead

What do you need to do to narrow the gap?

What tools and infrastructure changes do you need?

How can you improve the quality of your source content before it is translated?

What changes do you need to make to your workflow to support your global content strategy?

Why you should care

Here’s why you should care about having a global content strategy:

You care about the money you spend translating content.

You care about the quality of your content in all languages.

You care about the time it takes to localize and translate content.

Someone told you that you’d better figure out this mess (probably because of reasons 1–3).

Global content strategy is a large topic. Think of it as putting the topics of structured authoring, single sourcing, and web-content management into a global blender. Puree on high, then add those cumbersome tasks of tracking the number of languages, the number of translation vendors, the content created in other countries, and more. Garnish with a nice chunk of pineapple.

You need a global content strategy if…

“But wait!” you say. “I don’t really need to care about my global content, do I? Isn’t that someone else’s problem? Isn’t that the job of the localization team (whatever it is those people do in that other building on that other campus)? Why would anyone other than the localization and translation people need to care? I’ve got enough content to worry about.”

Well, my friend, I’m here to tell you that if you care about content for customers who are located in your country, then you must care about content for your foreign customers, too.

You need a global content strategy in any of these cases:

You translate content into four or more languages. In my experience, after you hit four languages, you’d better start managing your global content strategically. Many companies choose to implement a global content strategy with three languages. If the number of languages you support is likely to grow, it is never too early to plan for managing that growth.

Multiple groups in your company translate content independently. Often, multiple groups translate content with no coordination or even awareness between the groups. I’ve seen product groups release datasheets or web pages in multiple languages without anyone in marketing ever participating in a review. International offices are creating content in their native languages, and no one at corporate ever sees it or knows about it. I’ve seen sales teams located in Asia, for example, create their own materials without informing the localization team in the United States. It’s common – and often beneficial – for regions to create unique content in their native languages. After all, who knows a market better than the people who live there? Still, it’s important for you to know where all your content is, who is managing it, how many languages it is in, and where it lives.

You don’t know the name of the head of localization. If you are responsible for content – any content – that is being translated, you must have a direct line to your localization team. Siloed efforts at translation and localization might work for a while, but eventually, they fail.

You have so many translation vendors that you can’t remember which ones you sent what content to. Many large companies work with more than one translation vendor. That’s a common practice. Unfortunately, I’ve seen companies work with so many translation vendors that they can’t keep track of who is translating what. This lack of coordination results in some content being translated multiple times and some content never making it to translation. In some cases, the lack of coordination can even delay a product launch.

You don’t know what TM stands for. TM stands for translation memory. If your content is being translated, you need to understand what TM is and how it works. You also need to manage your TMs. Having multiple vendors, each using its own version of your TM, can be ridiculously expensive and can create mismatched translations and overall confusion. Know where your TMs are, and keep them to a minimum. In an ideal world, you have a single TM that is used by all your translation vendors.
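At its simplest, a translation memory is a lookup table of previously translated segment pairs that is consulted before any new translation work is paid for. The toy sketch below is purely illustrative (the file name, the segments, and the exact-match-only lookup are assumptions; real TM tools store rich metadata and also do fuzzy matching):

```shell
# Toy translation memory: a tab-separated file of source/target segment pairs.
printf 'Hello world\tBonjour le monde\nSave your work\tEnregistrez votre travail\n' > tm.tsv

# Exact-match lookup: print the stored translation for a source segment, if any.
tm_lookup() {
  awk -F'\t' -v s="$1" '$1 == s { print $2 }' tm.tsv
}

tm_lookup "Hello world"
```

However it is implemented, the core idea is the same: never pay to translate the same segment twice.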

You are considering using machine translation. There are different kinds of machine translation: statistical, rule-based, and hybrid. Machine translation (MT) systems can be complicated and expensive. To do it right, you need to use best-of-breed MT software and have experts help configure the software for your content.

You have tried using Google Translate for real work. Many companies think that using free machine translation software is good enough. It is not.

Even if you have only one of the factors listed above, it is important to strategically manage all the moving parts in the global content workflow.

Google Translate is not a global content strategy

We would all love to believe that free MT suffices as a translation strategy. In fact, some companies use free translation engines, like Google Translate or Bing Translate, for real work. I’m here to tell you what you already know: you cannot rely on free MT engines if you want to be certain that your translations are accurate.

Free MT is fine if you’re translating a letter that your great aunt sent you from Italy. It’s also fine when you want to tell a new friend “Welcome to the United States” in his or her native language. But free MT is unacceptable in any setting where you depend on the quality of the translation. This includes all business communication more significant than an email to your colleague in Uruguay telling him that you look forward to seeing him soon. If you care about your brand, if you care about your customer, if you care about your job, hire a professional translator for content that matters.

Will there come a day when free MT is as good as professional translators or even specially programmed MT? Perhaps. But that day is far off.

Content strategy versus global content strategy

You would think that any thorough content strategy would include all of the concerns that are important for global content. Unfortunately, this is not the case. These days, there are several types of content strategy. Examples:

Web content strategy, which focuses on content delivered via the web

Technical content strategy, which usually includes structured authoring and content reuse

Global content strategy, which includes all content, everywhere, in every language

I think it is time we stop segmenting content strategy into global and non-global. If all companies included global content from the beginning of the planning phases, and if all content developers planned for translation as they created the source content, companies would avoid many problems – and we wouldn’t need the term global content strategy. Ideally, the term content strategy would cover global considerations implicitly.

Until we get there, we need to put the global in content strategy. And we need books like this to show how it’s done.


Helping Writers Succeed with Great DITA, Minimalism, and Agile Content

Too often, when we work on a DITA implementation, we spend a lot of time on the bells and whistles of our CMS or the DITA-OT publishing scripts. But often the content is bloated, uses non-semantic markup, is riddled with duplication, and suffers from condition overload. If that’s the case, you won’t get the benefits. Bottom line: DITA, minimalism, and agile development are powered by robust, semantically correct DITA content.

It’s not just that you won’t be getting all the benefits. Having messy content sets up your writers for failure and frustration.

The meals are pretty poor but then they never use fresh ingredients – garbage in, garbage out.

Are technical writers the best agents of change for DITA, minimalism, and agile content?

How much can you rely on writers/editors deeply embedded in the product teams to be the blunt destroyers of duplication that we so much need?

In general (and there are always exceptions), the greater the portion of the remove/repair/prune burden that we impose on writers/editors, the lower will be the benefits.

In other words, the more we rely on embedded writers, the lower the RERP benefits:

Slower migration

Reduced ability to support reuse

More frustration and conflict with product teams

More frustration for writers and editors

Higher translation costs

Longer turnaround time

Why tech writers embedded in the product team may not be the best DITA change agents

The more that we rely on embedded writers to get great DITA, minimalism, and agile content, the more disappointed we will be in the results. This is true for a few reasons, including:

We are humans, and we have relationships. When we RERP, we will be brutally pruning the content of our colleagues, people who want to like us. The further the RERP team is removed from the content owners, the easier it is to be brutally effective.

Deconstructing someone else’s work is much easier than deconstructing your own. (This is why publishers have editors!)

Embedded writers will need time to scale up to the required new skill sets (minimalism, conref’ing, applying conditions within reused items where possible, creating reusable chunks in content banks, applying semantic markup, and more). The greater the share of RERP on embedded writers, the smaller will be the benefit and the longer the delays. (When a dedicated team handles RERP, each team does not have to repeat rookie mistakes. Roughly translated folk expression: you don’t have to learn how to shave on your own cheeks.)

Change is hard. Big change is harder. There are lots of talented writers on your team. Some of them will get with the program out of the gate. Most of them will require time. By reducing what we ask of individual writers, we can get them to the desired skill levels and new mindset faster.

So, what is the alternative?

For RERP, here is a proposal: Freeze → RERP → Thaw → Enhance

Selected Content Freeze – Content selected for RERP is frozen for a short period (for example, from Thursday to Monday, or any two- or three-workday window). During the freeze, writers may not make changes to the content.

Selected Content RERP – The content is reworked by the dedicated RERP team. During this period, they may ask questions of the content owners.

Writers create product content. Lots of it. Content of all types: sales, marketing, support, technical, and training. Because of the way companies are organized, writers often create product content in isolation from one another. They seldom collaborate, even though it makes perfect sense for them to do so.

Typically, the way companies organize and run content-creating departments is to blame. They build borders between content teams that introduce expensive and preventable challenges, many of which have an adverse impact on how prospects and customers feel about a brand. They give people who work on content teams different titles and allow them to shape their own rules, select their own tools, and create their own processes.

Because there’s little knowledge-sharing—and even less collaboration—between content-producing departments, most big brands end up creating giant content hairballs that spawn more problems than they solve.

Prospects and customers should not have to work to decipher what a brand is trying to say

It’s the job of management to eliminate the causes of schizophrenic product messaging, confusing jargon, and dramatically incongruent product content experiences. Customers shouldn’t have to work to decipher content incongruities or be forced to navigate chaotic product content collections masquerading as quality consumer information.

The truth is prospects and existing customers don’t have to schlepp through your pre-defined purchasing process or get sucked into your fanciful sales and customer loyalty funnels. They are in control. You are not. It’s your job to listen, to remember, and to respond. How well you chose to do that will differentiate your brand from the competition.

Source: “The consumer decision journey”, McKinsey&Company

Technical product content is a business asset; leverage it wisely

Product content is a valuable business asset. High-quality, consistent technical product content acts as an attraction factor. It’s also a confidence builder and a powerful sales engine. As such, it’s an asset worthy of being managed efficiently and effectively.

Consumers rely on technical product content to help them make informed purchasing decisions. They rely on it to be what they need it to be, when and where they need it, throughout their relationship with your brand. At every turn. Every time. On every channel.

Poor technical product content experiences are costly

Second-rate technical product content experiences are unfortunately the norm. This sad reality really turns people off. Increasingly, consumers express their unwillingness to hurdle discombobulated content collections in order to solve seemingly simple challenges. They do so by abandoning shopping carts, voicing their concerns in public digital forums, and returning products for a refund.

No brand wants returns. The hard costs associated with returns can be significant. Returns are revenue-draining; money that should be spent on sales-generating activities must instead be spent processing returns for unsatisfied customers. No matter how you slice it, there’s nothing good about that.

Many industry sectors have yet to grasp the importance of providing technical product content to help drive purchasing. In fact, until recently, technical content—product specs, assembly, maintenance, repair, and support content—was thought to be useful to consumers only after a sale.

Brands don’t control the customer journey—customers do

There are many reasons why prospective buyers of consumer and business products want to understand their entire experience with a brand before they buy. Shoppers—both B2B and B2C—have been burnt before. They’ve learned lessons the hard way. As a result, they often want to compare the technical product content provided by brands they know against brands they are considering. Support, training, and documentation content experiences often weigh heavily in purchase decision-making. Prospects seek confidence that they are making the best decision for themselves by examining all the information they believe they need beforehand.

But most brands mistakenly believe that a sales lifecycle is a customer journey. They spend considerable time and effort mapping out imaginary journeys designed for prospects to follow. These hypothetical adventures are baked up by well-intentioned, but misguided, content professionals who apparently believe that if they create a linear journey (or better yet, a sales funnel) that prospects will line up to stop at each touch point and smell the content. Nothing could be further from the truth. Customers don’t follow journey maps.

Creating exceptional technical product content experiences must be more than lip service

No brand wants products to be returned. No brand wants bad product reviews. No brands want to invest in creating sales pipelines that turn paying customers away. And, no brand wants to waste money producing content that fails to perform.

That’s what most brands say, at least. But it’s hard to find evidence that their words are more than lip service. Actions speak louder than words. Savvy consumers have little tolerance for empty promises.

To be clear, customers expect more today than in the past. They know they deserve better. They know they have choices, and your brand doesn’t have to be one of them. If your brand isn’t willing to serve their needs, chances are another company will.

In our 2017 DITA Satisfaction survey, we learned that many companies that have adopted the Darwin Information Typing Architecture (DITA) aren’t 100% satisfied with their DITA implementation. In fact, 62% of respondents report having problems—some major, some minor.

Common DITA Challenges

We asked survey respondents to rank their biggest DITA challenges.

Here are the top ten:

Lack of a proper content reuse strategy

Making DITA do what we need it to

Lack of training designed to equip staff with knowledge/skills needed

Understanding about how software works

Developing style sheets and other rules for content

Enforcing the rules for content

Finding skilled and experienced employees

Lack of understanding about how to create DITA content

What to measure and how to measure it

Translation challenges introduced by DITA content

The Benefits of DITA

While the primary focus of the survey was to identify the level of satisfaction amongst organizations that have adopted DITA, it also uncovered the most often cited benefits. Not surprisingly, improvements in content consistency and content usability were the most common benefits reported by the 250 organizations that responded to the survey.

While translation savings ranked lower on the list of benefits than anticipated, that’s likely because the survey did not segment respondents by the number of languages produced. Amongst companies that produce multilingual content using DITA, translation savings likely rank higher.

Overcoming DITA satisfaction challenges

If your organization uses DITA (or is planning to in the future) check out this useful and actionable series of articles from Rob Hanna, a DITA expert and a long-time member of the technical committee that manages the DITA standard. Rob has parsed the survey data and shared his views on how you might tackle some of the biggest challenges facing DITA users.

Our 2017 DITA Satisfaction Survey Report provides a high-level understanding of the top challenges facing technical communication teams that have adopted DITA. If you have not received a copy of the survey summary report (aggregate results for technical communication professionals working at 250 companies in 25 countries around the world), you can download your copy here.

“Surely I’m overthinking this,” I kept saying to myself, impatiently waiting for a solution to appear before my eyes. I learned that it’s tough to create versioned content with a static site generator, while making sure that the source control system does the version tracking and scripts do the build automation and your theme doesn’t lose its mind.

I needed to find the best way to move our current non-versioned content in a Jekyll site to output that shows users versioned sites. This is an age-old problem in the technical writing world, solved over and over again. I couldn’t quite get there on my own, so eventually I consulted my former co-worker Carolyn Van Slyck to talk through solutions. She showed me how to copy source files at build time without checking them into Git, and she provided a valuable sounding board for my “am I the crazy one?” moments. Much gratitude to Carolyn for the help with my test repository at versions-jekyll!

Complexities

I found that complexity arises in a couple of key places when using a static site generator. The first layer is the source: when using a source control system, I shouldn’t need multiple copies of content files in the repo itself. The second layer is the output: with static sites, I should build to a folder named with the version value each time a release occurs.

Other complexities may arise as you analyze, such as needing versioned translated content, or making sure that only certain users have access to certain releases or must login before accessing the site.

In the output I’ll still need to make sure my links work and my CSS and JS files are properly linked. I shouldn’t have to maintain copies of actual output files in my source repo, because I do not want to sort out merge conflicts in HTML.

So with those general guidelines in mind, I set out. I wanted to see if what we do in OpenStack with versions using Sphinx, and what Read The Docs does with versions using Sphinx, could translate to versions and Jekyll. What I learned is that it can, but not exactly in the way I was thinking about it.

Setting a version value in the source

With Jekyll, you can get access to values by setting them in a _config.yml file, which is used to build the site each time. You can also set values in a specific config file used only for that version, and build with both the “base” _config.yml file and the versioned config file. You can also have a theme that uses a metadata value called “permalink” that sets the output folders that make up the URL for the output file. All these configuration options could be used to set the version value. Testing each option took some time.
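As a sketch of the config-file approach (the version value 4.2 and file name are illustrative, not taken from the author's repo), a per-version config can simply override baseurl and then be layered over the base config at build time:

```shell
# Write a minimal per-version config that overrides baseurl
# (version "4.2" is a hypothetical example).
cat > _config.4.2.yml <<'EOF'
baseurl: /4.2
EOF

# It would then be layered over the base config at build time, e.g.:
#   bundle exec jekyll build --config _config.yml,_config.4.2.yml -d _site/4.2/
```

Because later files in the --config list override earlier ones, the base _config.yml stays untouched and version-free.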

Outputting version values into directories

With a Jekyll theme like Minimal Mistakes, you cannot build directly to the gh-pages branch, because it uses gems that are not supported by GitHub Pages. Instead, the build script in the versions-jekyll repo builds locally and copies the files to the gh-pages branch for publishing. So, the script needs to build the versions available to it locally. Right now, a “for” loop in bash does this work, so the script reads in version values.

An approach I learned was that you could pass in more than one _config.yml file at build time. With the correct configuration, Jekyll builds all the relative links to CSS and JS files needed for the theme. I did have several failed attempts while I played with permalinks, baseurl, and so on. I found this post, Configuring Jekyll for User and Project GitHub Pages with reference tables super helpful.

As versions are added, you’d need to update the script with the values, or you could read in a versions list from a file. Here’s an example for three version values to be outputted into three folders, example.com/4.1/, example.com/4.2/, and example.com/4.3/:

for v in 4.1 4.2 4.3
do
# Checkout a versioned branch with the version name
git checkout $v
# Create a configuration file that sets a new baseurl based on version
echo "baseurl : /$v" > _config.$v.yml
# Build using both the basic configuration file and the version config file
bundle exec jekyll build --config _config.yml,_config.$v.yml -d _site/$v/
done
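As versions accumulate, the hard-coded list can be replaced by a version list read from a file. A minimal sketch (the file name versions.txt is an assumption, and the build step is stubbed out with echo rather than a real Jekyll build):

```shell
# Keep version values in a file, one per line, instead of in the script.
printf '4.1\n4.2\n4.3\n' > versions.txt

# Loop over them the same way the build script would.
while read -r v; do
  echo "Would build version $v into _site/$v/"
done < versions.txt
```

Adding a release then means appending one line to versions.txt rather than editing the build script.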

Explanations

The version control of source is done with git through stable branches or tags. You can debate a bit about which is “better” but I believe stable branches make more sense than tags so that you can later backport any fixes to a branch sensibly.

The versions of the output are not semantically meaningful, and can have silly names if needed. Output versions can also skip a release, so the version value is treated like a string and not an integer.

The settings in _config.yml set the destination folder at build time for the versions. I’ve tested using both “collection” settings and “baseurl” settings and found “baseurl” to be the simplest for our theme. Your experience and theme settings may push you one way or another.

The output goes to GitHub Pages, in separate folders per release, so that the URL also reflects the version number you read on a given web page. In our case, we build locally, then push to the gh-pages branch from that local build.

The master branch always reflects the current, or latest, version of the docs site. I’ve chosen to use /latest/ in the output URL but you may have reasons for naming it in another way. For example, you may not want to always publish master to a public site, and instead need to publish from a stable version only.

Minimal Mistakes, the theme we use, supports collections. The permalinks are encoded in the markdown files themselves, but they are not used when versioning docs with the baseurl approach, so the collections remain intact and unchanged in our solution.

Design decisions

On the front end, you’d still need to design for scenarios such as: What if the page I want does not exist in the version I’m browsing? Do you give a 404 error or a special “not found” page when navigating versions? What about an archived version: can someone still find it and browse it?

Also, if your product provides n.n.n release versions, but really the docs site only changes drastically at n.n release versions, you could version the source files at n.n.n but only output content at n.n releases. This design decision would make the docs simpler to navigate, but may cause confusion if a user looks for the site versioned at n.n.n.

I’ve also not discussed the problem of search scope here. The design decision is whether to enable searching across all releases, or to scope the search to only one release at a time.

Testing

Testing different solutions took weeks. I wanted to have confidence in the maintainability and design of our solution. I also spent several weeks diving into complexity before resurfacing and seeing that the baseurl solution had been in front of me the whole time.

So when I went to try out the “fake” install guide in the versions-jekyll repo, I built the site, went to the site URL, and entered a version number plus install-guide/ in the browser URL. But there is no index.html in /install-guide/, because the only files are named introduction.md and deployment.md. So I added an index.md, but Jekyll still wouldn’t give me a URL like site.com/current/install-guide/index.html, even when I set the permalink metadata to /install-guide/ in an /install-guide/index.md file. Why not? My guess is that Jekyll wants the permalink settings to be at the same depth across the site. This confusion pushed me to look for an even simpler solution.

Summary

For our Jekyll docs, changing the baseurl for the entire site as a version value became the simplest solution. I investigated collections for versions quite deeply, but wasn’t happy with needing to change so many permalinks each build, and the risk of broken links seemed higher. Fortunately, our layers of complexity are minimized, in that we only have a few deliverables, our site’s content is not translated, and we do not need to require logins on the content.

Blockchain. It’s popping up in news stories, magazine articles, blog posts and presentations. But what is it, exactly? And, why should we care?

Our friends at CommonCraft explain the basic idea behind blockchain in the video we share with you here. To learn more, stop by Blockgeeks to discover how blockchain came to be, where it’s in use today, and where it’s heading tomorrow. Blockchain is a great example of innovative disruption. The story behind it and its creator(s) makes for some interesting reading.

The Current State of Author Experience

The idea of author experience is often dismissed by people with budgets and those who design and build CMSes. They see more important places to invest, better opportunities for quick-win returns. Author experience demands a long view; it suffers in the face of short-term investment attrition. We must, therefore, demonstrate the loss, and the risks, incurred by ignoring this opportunity.

Messy system, messy content

The purpose of a CMS is to facilitate the communications process for authors.

Author experience is the discipline of implementing the part of a CMS that enables it to fulfill its purpose.

It all sounds straightforward. Figure out what authors need to do their jobs better and accomplish their tasks with the least pain and disruption, then implement that. Right?

Yeah, right!

The state of the CMmess

Considering only those who classify themselves as CMS providers, there are hundreds of players. Some offer enterprise implementations with significant license and consulting fees. Some offer usage-specific, open-source platforms that anyone can set up in a matter of minutes. Most of the rest offer something in between. Then there are players who offer custom implementations, often derived from existing platforms.

Commercial CMS vendors win contracts by offering a set of features that looks good in a presentation deck and impresses the client’s marketing, IT, and purchasing departments. It doesn’t matter how well those features work because vendors know that clients will want the system customized to their needs; anything out-of-the-box will be modified.

A features list, without integration

A couple of years ago, while solving some issues for a client, I ran into the following problem.

The platform, for which they had paid seven figures, provided an element that let you associate a block of text and an image. It also had a localization model that enabled content structures and elements to stay synchronized.

Great, you might think. We need both features. Except, when you edited a localized version of the text, the content synchronization was lost at the level of the entire element! Change the text, and the image was no longer synchronized with the master content instance. Any change to the image now needed to be replicated to 26 locales.

This platform ticked both feature boxes, but in practice, it failed completely.

Open source platforms are built by people who are technologically competent; they understand the workings of their systems. And because they are familiar with the technology and storage/retrieval paradigms, they do not find it onerous to manage their content using those paradigms; the model makes sense to them. However, the non-technical author is left out in the cold.

Custom implementations fare somewhat better. Because they are usually built for one client’s specific task, they stand a decent chance of doing that task well. But rarely do any improvements make it out into the wider world; the value is not shared.

Content management, generally, is a mess. Almost no attention is paid to the people who use content management systems. They are left to decipher interfaces and paradigms that bear no relation to the way they think about the content they manage. How the system presents content is sexier than how well it enables authors to manage content; the end-user experience is given more weight than the author experience.

Exposing the value

These problems engender frustration and drain resources. If you are trying to solve a content problem and the CMS forces you to think in a way that does not make sense to you, you cannot be productive. Frustration results in a lack of attention. If something that should be handled in a single process requires multiple steps, linked together manually, you make mistakes.

Mistakes in communication result in lost revenue and excess costs.

Improving author experience – enabling the CMS to fulfill its purpose – provides a direct business benefit in communication quality and integrity. Whether you can free up individuals and resources is secondary.

In a world where reputation constantly hangs by a thread, subject to social-media mockery, poor content management can so seriously damage an organization’s reputation as to put it out of business.

Professional tools for professional communicators

Content management is about communication. And communication is the lifeblood of any business. Communication sells our products and services and supports our customers. Communication is arguably the most important function in any business.

Content management is a high-impact service provision within any organization. So, it makes sense that the people carrying out this function must be professional communicators. And professionals, whatever their expertise, need tools suited to their trade.

Purchasing a platform

Content management tools have their own models and paradigms, which are not particularly suited to the mindset of the people who must use them. (See Section 4.3, “The technologists’ paradigms” for details.) This is in direct conflict with the purpose of a CMS, which is to facilitate a human process.

Technology purchasing process

In the old days, any type of content management was considered to be an IT responsibility. After all, the system revolved around a big, scary computer – definitely part of the IT world.

And the IT department made its own rules, in effect dictating what the rest of the business could or could not do. If you wanted a system, you informed the keepers of all things technical, and they took ownership. Unfortunately, the best-of-breed solution they gave you met criteria that were not necessarily based on the functionality you needed. The IT department was buying a new toy that suited its rules without asking what the new toy should do – not that the business really understood what it wanted anyway. Most often, this disconnect happened because authors were not consulted on their needs, and if they were consulted, there was no content strategy to speak of.

As technology has become ubiquitous, more integrated into everyday life, other departments have gotten in on the act, making purchasing decisions for themselves and moving IT to a supporting role. This has, to some extent, been abetted by information systems becoming available as cloud-based services.

But one feature from the old model remains. Even when the right departments are buying, they often do not understand the full scope of the functionality they need before making the selection; they buy based on a checklist of features, too often based on the latest social buzz. They select the platform before anyone considers what it needs to achieve. Then, the platform’s capabilities and limitations drive the customization. The business is forced to change its practices and processes to comply with the technology, which cannot be good.

We chose it because…

An organization I know is being forced by regulators to sell off part of its business. As part of the sale, they are required to set up systems for the spin-off business, including the marketing website. They have already selected the platform. When I asked them about the selection, they admitted that they had not analyzed their communication requirements, nor had they considered the actual business needs or the platform’s compatibility with them.

As far as I can tell, the two most likely reasons they selected this platform are unrelated to their requirements or business needs:

They are already using this platform for another system.

They have spare licenses because a subsidiary just canceled a build on said platform.

I know the selected platform painfully well. I know what it cannot do. I know some of the basic requirements this organization will have for communicating their offers to the market. And I know the spin-off business will need to replace this system almost immediately.

I need a new CMS

The reality of most content management implementations is that the process does not support creating an optimal author experience. The common sequence is the following:

Someone identifies that a new or replacement system is required.

Because the system requires a major capital expenditure, the IT department is tasked with fixing the problem.

The IT department creates a list of functionality and features based on their perception of policy and end-user needs.

An RFP is issued, and a new platform is selected.

Designers and information architects are brought in to figure out how the system will work; maybe content strategists are brought in to determine what the content should be.

The CMS vendor’s implementation partner works to deliver on the promises represented by the RFP, while pushing back on the design and functionality.

The new system is delivered to great fanfare.

Within days – weeks at the outside – those responsible for maintaining the content wonder why everyone went to all that trouble. The new system is no better than the old platform; it does not help.

And so the process begins again. This brings to mind Einstein’s classic definition:

“Insanity: doing the same thing over and over again and expecting different results.”

—Albert Einstein

This does not need to be the case. There is another approach.

Rethinking the process

If we add two elements to this process, the results become significantly different. (While either one alone would make a marked difference to the outcome, both are needed to get the best results.)

First, we must define the communications the system will carry, starting with the fundamentals: Why do we need the system? What is the purpose of the information exchange? From this, through a few more steps, we can identify and model the content required, at least in the abstract.

Second, once we know what content needs to be managed, we can determine the content management paradigms best suited to the authors – from initial creation by subject matter experts through copywriting and editing to distribution through the myriad mechanisms, channels, and devices from which it is consumed. Who must interact with the content? How do those interactions differ at various stages within the process? What are the workflows associated with creating, editing, revising, and archiving the content?

With this additional knowledge, the outcome is very different. Instead of system vendors pitching a list of features and functionality based on matters that have little (ok, nothing) to do with what the system is intended to do, we have RFP criteria based on ensuring the following:

The content managed through the system fulfills a purpose.

The system enables authors to manage that content properly.

I need a new CMS, part 2

The sequence changes to this:

Someone identifies that a new or replacement system is required.

Content strategists are called in to understand the communications purpose of the system and to model the content it must handle.

Author experience consultants map the modeled content to organizational workflows in consultation with all affected parties, ensuring that the paradigms are appropriate to the audience.

The IT department takes the requirements and determines whether a new system is required or whether the existing one can be extended to meet the content and authoring needs.

An RFP is issued, and a platform is selected that is capable of managing meaningful communication.

Designers figure out how to display the content.

Information architects work with the vendor to map the modeled content and workflows into the system’s paradigms.

The CMS vendor’s implementation partner delivers the content management, authoring, and workflow functionality they agreed to in response to the RFP.

The new system is delivered to a user base of subject matter experts, writers, and editors who have been looking forward to a solution based on their needs.

By defining the author experience before choosing the platform, we accomplish two goals. First, the delivered platform is fitter for purpose and, thus, provides a better return. Second, the process benefits the content management community at large because CMS vendors learn that for one more client, they cannot use sales tactics that ignore the actual purpose of the system.

Weighing author experience against user experience

We are all familiar with the concept of user experience: the need to focus intently on – even be obsessed with – the end user. Content strategist Eric Reiss defines user experience as “the sum of a series of interactions” between people, devices, and events. However, in practice, there is a tendency to consider just the end user – usually a customer, an external party – as the user of record.

With most CMS implementations, when author experience is raised as a concern, it is pushed aside by the demands of end-user UX. The end users are most important; we must ensure that the system engages them, keeping them busy with our content (buying our stuff or racking up hits that work toward our advertising revenue).

How can the author be more than an afterthought when there are valuable customers to be wooed?

The mindset driving most UX is that un-engaging content does not retain users, and a difficult-to-use environment encourages them to go elsewhere to fulfill their needs. If they have no choice but to use our poorly implemented system, they will become frustrated, leading to a combination of sloppy use and negative reputation.

That is a compelling argument to make UX your number one priority.

However, this argument ignores a fundamental detail. Authors are users too; the same psychological consequences apply to them. And given that the authors are responsible for managing the content that is supposed to engage end users, we cannot afford to use a platform that frustrates authors and keeps them from developing high-quality content.

The CMS must be valuable to the people who use it to manage content. After all, a tool that is valued by a worker is used by that worker.

If you don’t invest in author experience, the value of the CMS is severely degraded and the end-user experience is jeopardized. If you are spending money on a content management solution, it pays to consider which parts of the system provide the greatest overall returns. These will likely not be the quickest wins.

Taking the lead in improving AX

As you can see, author experience is a critical element in managing content, and it becomes increasingly important as organizations embrace multi-channel communications. But the available tools and people’s mindsets can get in the way. Chapter 4, The Challenges to Good Author Experience, contains examples of issues that hinder us – problems we must address to bring more tangible returns from the managed content.

But first, we have to deal with the question of responsibility. Whose role is it to ensure that author experience is given its due? Who fights for the budget? Who holds developers, implementers, and system providers to account? Who champions the communications process?

There is no simple answer, except to say: you do.

From a business perspective, anyone who understands the value of reducing risks associated with managing communications must be on board. If you own the budget for marketing, content management, or support, then author experience must be a high priority. When someone tries to siphon off funds to a pet project that will allegedly provide a quick win, the real price is in the stability and coherence of your content, putting at risk the very thing you are responsible for.

From a content manager perspective, I think it goes without saying: a good author experience makes your life simpler.

From a developer or implementer perspective, you have a responsibility to your clients, not only to provide them with what they explicitly request but to understand the purpose served by the system you’re developing. Don’t trade authoring functionality necessary to that purpose for something else, even if that something else is the latest “in” thing that everyone else has.


Is content marketing effective for highly technical industrial buyers? The old wisdom says no. It says that manufacturers and their customers are immune to content marketing messages that lack the granular, technical details. Details that make or break a sale.

But data from Content Marketing Institute’s 2017 Manufacturing Content Marketing survey show that the benefits of content marketing are becoming clearer to manufacturers who devote the time and resources to these marketing programs.

That must mean that technical buyers are buying in.

Manufacturers are gaining ground

One reason content marketing is viewed as ineffective in manufacturing is that fewer manufacturers have these programs in place, compared to other marketing sectors. But that’s changing rapidly. Manufacturers are adopting content marketing programs at a quickening pace. 31 percent reported having documented content marketing strategies compared to just 18 percent a year ago.

The data also point to manufacturers beginning to understand content marketing in the same ways other industry sectors do:

Recognizing how content marketing plays into overall brand experience, focusing on quality over quantity, and understanding that results take time are critical pieces of a solid content marketing program. B2Bs with successful programs know this; these data show more manufacturers do, too.

Measuring maturity in content marketing programs

Manufacturers are largely new to content marketing compared to other industry sectors. Only 19 percent of surveyed firms claimed to have mature or sophisticated content marketing programs.

But what of the remaining 81 percent?

37 percent said their content marketing programs were in adolescent stages; they had developed a business case for programs, seen some success with them, and improved at measuring and scaling them

28 percent claimed to be in young stages where growing pains were evident as they tried to develop cohesive strategies and measurement plans

Note that commitment among manufacturers to develop and/or maintain their content marketing programs is strong:

49 percent claimed to be extremely or very committed to content marketing

59 percent said their organizations have been much more or somewhat more successful thanks to their content marketing programs

Finally, consider this other important point: 19 percent of manufacturers reported their overall content marketing approach has been very successful while 49 percent considered their approach moderately so.

We know content marketing takes time to produce results, and we know that manufacturers are committed to maintaining or bolstering these programs. If the trends hold, the big bubble of adolescent- and young-stage programs will soon burst into maturity.

What drives success?

According to the survey, a few factors emerged as critical to successful content marketing programs:

82 percent of manufacturers said that creating quality content factored into their success

57 percent said spending more time on content marketing was key to their success

Notice that the other points above are crucial steps en route to the first point—creating quality content. Keep that in mind when you consider what manufacturers said contributed to stagnancy in their programs:

67 percent of manufacturers claimed not spending enough time on content marketing factored into stagnation

Are these programs worth the effort it takes to devise, implement, monitor and measure them?

More than a quarter—28 percent—of manufacturers said they could demonstrate that their content marketing programs led to lower customer acquisition costs. That means manufacturers are not only buying into content marketing, they’re getting better at measuring the data that makes the case to keep doing it.

Optimism trumps uncertainty

While the data show that more manufacturers believe in content marketing enough to try it, some uncertainty about these programs remains:

A third of surveyed firms said their organizations were clear about what a good content marketing program looks like

A third said their organizations weren’t clear on this point

A third weren’t sure either way

Despite this uncertainty, it looks like manufacturers are pressing forward: 97 percent of them said they planned to create at least the same amount of content this year as they did last year; 68 percent said they planned to create more.

If some manufacturers still wonder whether these programs can speak to their buyers, the data show they already do.

Summary

A persistent claim in the B2B marketing universe is that content marketing programs are a poor investment for manufacturers. But data from Content Marketing Institute’s 2017 Manufacturing Content Marketing survey show that the benefits of content marketing are becoming clearer to manufacturers, and that must mean that buyers are buying in.

In Public Relations we know what makes a successful apology and what doesn’t.

SUCCESSFUL APOLOGY = a conversational, human approach

Hi Shelly

We’re sorry we got the date wrong for setting up the internet at your new home. We know that was really inconvenient.

Thanks for letting us know about the mistake so we could fix it. We’ll do our best to make sure it doesn’t happen again.

Thanks,

Your Favourite Internet Provider

UNSUCCESSFUL APOLOGY = the traditional, formal business voice

Dear valued customer

It is with regret that we write to express our apologies for the recent error.

There was an unavoidable disruption within our system due to a service upgrade. We apologise for any inconvenience this may have caused.

Sincerely

Just Another Nameless Faceless Internet Provider

I imagine you would struggle to think of the last time you were happy to read something written in that voice. And I’m not just talking about apologies. So I want to say this to you:

The formal business voice is DEAD.

There is no longer ANY place for it in business today. I cannot find a single situation where the formal is helpful.

Oh no wait, that’s not true. There is one time: If you want to threaten, use the formal voice.

“Should the undersigned not comply with the aforementioned conditions, immediate remedial action will be undertaken.”

If you want to alienate and intimidate and put the fear of God (or the courts) into someone, use the formal voice. If you want to achieve almost anything else on the planet, use a conversational voice.

I imagine that so far you’re reading and thinking, well duh, that’s obvious.

But here’s something I’ve learned through training thousands of people to write better in business contexts: Our writer selves don’t know what our reader selves do.

You know good writing. When you read (at work) you want clear, straight to the point, no fluff, no mucking around. But when you sit down to write, a completely different set of knowings takes over, and you completely forget what you know as a reader (or you think you’re different. Special. Unusual because you want those things. You’re not – sorry ’bout it. Everyone wants concise, clear, direct writing).

Our writer-selves believe:

there are unbreakable rules for good writing at work (and we learned them at school/university)

the examples of bad writing that we see all around us (that we HATE to read) are what’s expected of us in a business setting, period

Are you scared?

You wouldn’t be alone. I may have just shaken your foundations.

Alan Siegel, who’s known internationally for his work simplifying legal documents (while retaining all their legal power), describes what he does as “a means to achieve clarity, transparency, and empathy – building humanity into communications.”

I LOVE THAT because right there is my issue with the formal business voice and why I say it’s DEAD: The formal business voice removes the humanity.

It takes out the people. It takes out the you, we and us, and switches to third person – the client, the user.

It removes ownership and accountability and instead just talks about things “happening”, like:

Mistakes were made. [Isn’t that wonderful? They just happened. No one is to blame.]

It is recommended. [By whom? The universe?]

Don’t believe me?

People have been researching this stuff for decades. And we know that a simple, conversational voice is far more successful when compared to the formal voice:

It’s shorter

It’s easier to understand

It’s more engaging

It deescalates situations rather than escalating them (the formal voice sounds pompous, and the last thing you want when tensions are high is to sound pompous – cos that helps.)

Still don’t believe me?

Think about brands you love.

Think about how they write to you – by email, in agreements, terms, and conditions, on the web. They have a conversation with you. They don’t talk down to you. And you know what? If THEY can use a conversational, everyday voice and drop the formality in their business writing, SO CAN YOU.

Emoji Does It!

Google is notorious for having some of the worst emoji on the planet. Now it’s righting its wrongs–and taking on gender stereotypes, too. The effort aims to highlight the diversity of women’s careers and empower young girls.

Designing Content for Voice Interfaces

Voice is the new black—One day soon, we might talk to our devices the way we talk to our friends. Not in commands (like we do currently), but in conversational language. Our devices will talk back to us—and they’ll sound like people we want to speak to.

It’s definitely one reason our upcoming conference, Information Development World (November 28-30, 2017), will focus on helping you learn to craft dialog for chatbots and voice interfaces. Writing content for voice and chatbot interfaces will require us to develop new skills and ways of thinking about content. Visit the event website and sign up to be alerted when the roster of presenters goes live.

“Conversations are the new interface,” says artificial intelligence expert, Joe Armstrong. “Conversation designers are the new UX designers.”

To help rid Alexa of its cyborgian cadence, Amazon recently upgraded its “speech synthesis markup language tags,” which developers use to code more human-sounding verbal patterns into Alexa.

The new tags allow Alexa to do things like whisper, pause, bleep out expletives, and vary the speed, volume, emphasis, and pitch of its speech. This means Alexa and other digital assistants might soon sound less robotic and more human. But, until then, you’ll need to speak to Alexa using commands she understands.

Enabling access to the web using the spoken word involves preparing content in compliance with several content standards, one of which is Speech Synthesis Markup Language (SSML).
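As a rough sketch of what SSML markup looks like (the core tags below come from the W3C SSML specification; the `amazon:effect` extension is Alexa-specific, and the spoken text is of course made up):

```xml
<speak>
    Here is a sentence at normal speed and volume.
    <break time="500ms"/>
    <prosody rate="slow" volume="loud">This part is slower and louder.</prosody>
    <amazon:effect name="whispered">And this part is whispered.</amazon:effect>
</speak>
```

The `break` and `prosody` elements are part of the standard and work across most speech engines; vendor extensions like whispering are where digital assistants start to sound less robotic.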

Understanding Artificial Intelligence

Computers do a lot for us these days and can sometimes seem intelligent, like a person. But really, they’re more like calculators. Most only do what we program them to do, and nothing more. But this is changing—and quickly—thanks to neurosynaptic chips and other advances in computing technology that enable machine learning.
