[This is my first #techcomm blog post in about two years. That’s partly because I no longer work in a conventional #techcomm job, but mainly because I’ve become a bit lazy in my old age!]

Write the Docs meetup on documentation formats

Last week I went to a Write the Docs meetup in London, where three speakers addressed the topic of “Format the Docs” – not in terms of visual appearance, but in terms of which format is best for technical documentation (details). The three formats discussed were DocBook, DITA, and, somewhat surprisingly to some people, Microsoft Word.

DocBook presentation

The first speaker, Peter Desjardins (see details), explained that DocBook, an XML format, was first developed in the early 1990s and grew out of SGML. DocBook is very common in open-source projects, as it is easy to publish, easy to automate, and suitable for Apache-based systems. It has an explicit structure, which means it’s easy for processing tools to work with. There are various free and commercial editors available for DocBook. Plain-text source formats like DocBook can be stored in GitHub, which developers like. Using GitHub makes it easy to track and control changes, and easy to fork or branch. It also works well with automated build and continuous integration tools.

Some people complain that DocBook has too many elements. Peter pointed out, however, that if you are writing about Java classes and you can use a DocBook element called MethodDescription, that’s great. You can output DocBook through various XSLT transformations (for example, to web help or PDF). Some people feel that it takes a lot of effort to master XSLT and XSL-FO, but there is a load of helpful material available, so you can learn to write your own custom XSLT – Peter assured us that it looks more scary than it is. I wasn’t entirely convinced.
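Peter’s point about explicit structure is easy to demonstrate. Here’s a minimal sketch (my own, not from the talk) of why processing tools like DocBook: because everything is tagged explicitly, even the Python standard library can pull structure out of it without any heuristics. The markup below is simplified, DocBook-4-style illustration, not a complete document.

```python
# Pull section titles out of a DocBook fragment using only stdlib XML
# parsing - the explicit structure does all the work for us.
import xml.etree.ElementTree as ET

docbook = """
<article>
  <title>Payments API</title>
  <section>
    <title>Authentication</title>
    <para>All requests must be signed.</para>
  </section>
  <section>
    <title>Error handling</title>
    <para>Errors are returned as JSON.</para>
  </section>
</article>
"""

root = ET.fromstring(docbook)
# Every section announces itself; no guessing from fonts or indentation.
titles = [s.findtext("title") for s in root.iter("section")]
print(titles)
```

A real toolchain would run this kind of extraction via XSLT rather than Python, but the principle – structure in, navigation and publishing out – is the same.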

DITA presentation

Bridget Rooney was the second speaker and described DITA, another XML format for documentation that separates content from presentation (see details). The DITA Open Toolkit (DITA-OT) is available for publishing. DITA users tend to compose content in separate topics, with standard element sets for Tasks, Concepts, and References, and then use DITA maps to select the topics they want to include in a particular publication. This makes DITA very suitable when some of your content can be re-used across many different publications.
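The map-plus-topics idea can be sketched very simply. A DITA map is itself just XML listing topicref elements, so assembling a publication amounts to resolving the topics the map points at. The map below is my own invented example (filenames and title included), not one from the talk.

```python
# Read a toy DITA map and list which topic files this particular
# publication would pull in - a different map reusing the same topics
# would simply list different hrefs.
import xml.etree.ElementTree as ET

ditamap = """
<map title="Install Guide">
  <topicref href="concepts/overview.dita" type="concept"/>
  <topicref href="tasks/install.dita" type="task"/>
  <topicref href="tasks/upgrade.dita" type="task"/>
  <topicref href="reference/cli.dita" type="reference"/>
</map>
"""

root = ET.fromstring(ditamap)
topics = [t.get("href") for t in root.iter("topicref")]
print(topics)
```

In practice the DITA-OT does this resolution (plus filtering, keys, and output formatting), but the re-use story rests on exactly this separation between topics and the maps that select them.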

Bridget acknowledged that DITA shares a disadvantage with DocBook: technical skills are needed for transforms and publishing. She explained that her company had a two-stream publications team consisting of writers and “Doc-Ops” – programmers dedicated to supporting the publication process.

As someone who used to know a lot about DITA, I realised from Bridget’s talk that there has been significant progress in editing tools since I last used DITA, but that many of the obstacles to easy publishing still remain.

Word presentation

Richard Stamp was the last speaker and declared that Microsoft Word was actually a very powerful, configurable, customisable, mature, and stable tool, available on Windows PCs and Macs (see details). He said that the problem with Word wasn’t the tool itself, but the fact that 99% of users aren’t professionals, and therefore it was up to the 1% of professional users – such as #techcomm professionals – to show exactly how much could be done with Word.

Richard described three projects that he’d been involved in at different companies. The first made use of Word styles, and went beyond merely encouraging consistency. The project locked down the available styles, disabled manual formatting, and customised the ribbon to make things more acceptable to non-professional users. Richard considered the customised ribbon to be critical for user acceptance. There were different templates available, each with their own styles, macros (linked to buttons on the ribbon) and autotexts.

The second project related to ensuring visual consistency for website content created by non-technical users, where locked-down templates were not an option as each contributor was working independently from home. The solution involved porting from Word to Markdown (not too difficult technically, just a series of find-and-replace macros), and then from Markdown to Drupal (far more complicated, but it only needed to be set up once). Richard acknowledged that this solution probably would not have worked had there been tables involved.

The third project involved procedure-heavy documentation using DITA, and publishing with the DITA-OT, but using Word as an editor. This involved a lot of round-tripping through various VBA routines. It was feasible because Word’s .docx format is really just a ZIP archive full of XML files, and because the project was at a really technical company.
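That docx-is-a-ZIP fact is easy to verify for yourself. The sketch below builds a toy archive in memory and reads the main body part back; the XML is a drastically simplified stand-in for real WordprocessingML, not a valid Word document (Richard’s round-tripping used VBA, not Python).

```python
# Demonstrate that the .docx container format is an ordinary ZIP
# archive of XML parts by building a toy one and reading it back.
import io
import zipfile

document_xml = "<w:document><w:body><w:p>Hello</w:p></w:body></w:document>"

buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("word/document.xml", document_xml)   # the main body part
    zf.writestr("[Content_Types].xml", "<Types/>")   # required in real files

# Re-open the same bytes as a ZIP and pull the body text straight out.
with zipfile.ZipFile(buf) as zf:
    names = zf.namelist()
    body = zf.read("word/document.xml").decode()

print(names)
print(body)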

I apologise to all of the speakers if this brief summary hasn’t done them justice.

So what about Word, after all?

I very much enjoyed these three talks, and in particular I was pleased that there was a robust comparison of Microsoft Word to two other more technical formats in front of a very techy audience. If you’ve ever heard me speak at a #techcomm event or read this blog for a while, you’ll know that like many of my peers I have a love-hate relationship with Microsoft Word. It’s a great tool for some jobs, and it’s terrible for others.

In my experience the usefulness of Word decreases as the length or complexity of the document you’re trying to write increases. In my current job, it’s the only authoring tool I need, as I rarely need to deal with a document more than 30 pages long, and I don’t use lots of graphics or any cross-referencing. I will readily admit that current editions of Word are orders of magnitude better than the versions we had 10 or more years ago, and like Richard, I acknowledge that 99% of Word users aren’t professionals (and that’s probably an underestimate). I have occasionally used this blog to advise non-professional users of some basic tips (here, for example).

Richard’s talk explained how it was possible to do really useful work with Word, but in doing so he was actually supporting an argument that I have made several times in the past. In comparison to other content authoring tools (my counter-example was usually Adobe FrameMaker), it is quite difficult to do complicated things with Word out of the box. Once you have the skills to create your own macros, ribbons, and templates, and when you have the management and IT cooperation necessary to lock down Word to allow only your templates, then you can make great strides, as Richard explained. But in my experience, even though some #techcomm professionals do have the skills to create the templates, very few of us are able to get the organisational cooperation needed to enforce them. So we are left with Word being widely and poorly used across an organisation, while tech writers are expected to publish faultless documents without the necessary support. Writing teams that ask for more are often told “but you have Word, what more do you need?” Imagine a finance team that asks for accounting software being told it already has Excel.

The LavaCon conference on Content Strategy and Tech Comm Management is taking place this year in New Orleans. It’s one of the tech comm conferences I’ve never managed to get to, but I’d love to attend as they always have such a great line-up of speakers, and this year is no exception – and I’d love to visit New Orleans too. This year LavaCon are offering their Virtual Track once again, which means you can attend a whole range of sessions from your own home (or office). (Read on for news of a special offer for Marginal Notes readers!)

LavaCon is spread over four days and features speakers from leading companies including Adobe Systems, IBM, UBS, Boeing, Intel and PayPal among many others. It starts with a day of workshops on Sunday 18th October, followed by three full days of presentations on Monday, Tuesday, and Wednesday 19th – 21st October 2015.

All the keynote sessions at LavaCon will be broadcast live in the Virtual Track. In addition, one breakout session in each time slot is also included in the live video channel. The other four sessions will be broadcast in webinar format (slides and audio), so Virtual Track attendees can attend any session remotely.

The keynote sessions are all 18-minute TED-style presentations, and some of my favourite speakers are taking part. Andrea Ames from IBM (who is also speaking at TCUK this year) will be speaking about how you can improve your influence quotient, which can be so important if you’re the only person in your organisation who is promoting the importance of content. Vici Koster-Lenhardt will use her keynote session to show how her career successes have everything to do with the soft and hard communication skills she won fighting battles in newspaper and magazine publishing, tech comm, and corporate communications.

Noz Urbina is another keynote speaker; he will explain how to position content so that the people in your organisation with “the power and the purse strings” will notice. His keynote will be followed by a full session examining this vital question in more depth. Joe Gollner will address a similar theme in his session on the “Dark Arts of Content Leadership”.

These sessions are just a tiny fraction of what’s on offer at LavaCon 2015, and I really wish I could be there. But if I were there, I’d probably be driving myself mad trying to choose between the five tracks of presentations: Content Strategy; Content Production; User Experience; Tools and Technology; and Management and Governance. If you do get an opportunity to be there – well, I’m jealous!

I’ll be reporting back on the LavaCon Virtual Track after the event.

Special offer for Marginal Notes readers: Benefit from a 20% discount when you register for the LavaCon Virtual Track when you use the discount code “dfarb”.

Disclosure: David is attending the 2015 Virtual Track as a guest of LavaCon.

The Technical Communication UK conference 2015 (TCUK 2015) is taking place in Glasgow from 29th September to 1st October. The constant changes in the industries we serve, along with the changes in the tools and methodologies at our disposal, have prompted many technical communicators to think about transforming their working practices, or transforming their own career paths, or both. We are all capable of Breaking the Boundaries of Technical Communication and using our skills and expertise in new ways, or in new areas, and that’s the unifying theme for this year’s conference.

One of the reasons I’m looking forward to TCUK 2015 is because of keynote speakers who each show ways in which we can challenge and break through the boundaries and constraints we face.

Andrea L. Ames is a Senior Technical Staff Member and Enterprise Content Experience Strategist/Architect/Designer in IBM’s CIO organisation, where she enables strategic use of IBM’s high-value content assets for the greatest client delight and success and the highest IBM business impact. Her keynote will show that, to move beyond mere survival in our industry and to thrive and find great success, you must become comfortable with ambiguity and learn to love change – whether you’re just getting started or already well along a change-embracing path.

Chris Atherton returns to TCUK as our third keynote speaker this year. Chris is a Partner at Equal Experts, where she engages in user research and user-centred design to help clients transform their software delivery processes. Originally from an academic psychology background, Chris got interested in how people process visual information on screens, and subsequently ran off to join the software industry. Many technical communicators would love to talk to real users but find many obstacles in their way. No budget? No user experience colleague? No problem. Chris will demonstrate some simple ways of getting started, persuading people to join you, and then scaling things up.

Unfortunately Murray Cox, who was originally announced as a keynote speaker, is now unable to attend TCUK 2015. His place will be filled by Neil Perlin. Neil is an internationally-known online content consultant, trainer, and columnist, and is certified in both Adobe RoboHelp and MadCap Flare. Neil’s keynote presentation is on “Breaking Our Own Boundaries”, looking at ways to break out of self-imposed boundaries – expand, extend, and avoid pigeonholes – to help us move in challenging, well-paying, and fun directions.

You know those clickbait sites that tell you how much of your life you’ve spent asleep or in the lavatory? Well, I’ve been working in high tech for more than twenty years, and I feel I have spent most of that time upgrading Microsoft Windows.

My first role on a technical writing team was being the very junior person who gets all the dull jobs. One of mine was running the upgrade from Windows for Workgroups 3.1 to Windows for Workgroups 3.11 on a dozen desktop PCs. (They sure knew how to name their releases back in those days.) About a year or so later along came Windows 95, and we all thought the future had finally arrived. Those were the days. So when it comes to upgrading Windows, it’s all a bit “been there, done that, lost the t-shirt ages ago” for me. Could Windows 10 impress me?

The first thing that did impress me was the price. Windows 10 is available free of charge to anyone with Windows 7 or Windows 8 for at least the next 12 months. That sounds innovative, except when you remember it’s what Apple have been doing for ages.

There have been a lot of positive reviews about the new interface, and this one in the UK magazine PC Advisor is typical. The interface is certainly modern, and clean, and it combines characteristics of both Windows 7, which was an evolution of the older familiar Windows we’ve known for a long time (a very long time in my case), and the Windows-phone style tile-based interface used in Windows 8 and 8.1. People who didn’t like the Windows 8 design will be pleased to see that the Start menu is back – even though it’s full of those annoying tiles.

Windows 10 doesn’t seem to refer to “programs” any more, only to “apps”. To me it seems that Windows 10 is simply using the more fashionable term, and I don’t know whether it’s even worth discussing whether there are technology differences between apps and programs any more. OK, I’m an old fogey, I know.

Microsoft has proudly asserted that Windows 10 will be available for all platforms – PCs, tablets and phones – so that users will have a seamless experience. That sounds very nice, but it relies on a number of assumptions. One is that you are happy to store all your important files in the cloud, specifically Microsoft’s own OneDrive. My personal experience with OneDrive has been far from positive. The next assumption is that all your devices are constantly on, and constantly connected to the internet. They also assume that you are, by default, happy to share everything with everyone, particularly with advertisers, and that you’ll just love getting recommendations about what to buy, where to eat, and what to listen to based on what you bought, ate or listened to recently. If that doesn’t exactly describe you – for example, if you actually use a computing device to do your work, rather than just to share things with other people – you’ll need to change a lot of Windows 10’s default settings.

Another thing to be aware of is that with Windows 10, updates are always on. In contrast to previous versions of Windows, you can’t choose not to have automatic updates. That does mean you’ll always have the latest security and operating system patches, which is good, but it also means you can’t control when those patches are downloaded and installed, which may cause some problems. It’s all getting a bit too Orwellian for my taste, to be honest.

How I ran my update

I ran my experimental installation of Windows 10 on an older laptop that had been cosmetically repaired (new screen, new keyboard, webcam broken beyond repair). It was running 64-bit Windows 7 Home Premium with SP1, with a Celeron 2.00GHz processor and 3GB RAM. This was within the range of machines that could be upgraded, but definitely towards the low end of that range.

If you are running an upgradeable version of Windows, Microsoft has been encouraging you to register for the free Windows 10 upgrade and wait your turn, but if you are impatient like me you can download the upgrade yourself. I found my way to https://www.microsoft.com/en-us/software-downloads/windows10 which is a page intended for network administrators, so what you are downloading is a Media Creation Tool.

The download offered two options – directly upgrading your PC or creating an installation ‘disc’ (on a DVD or a USB stick). I chose the direct upgrade first time, which failed, and the USB option the second time, which was successful. You will need a USB stick with at least 6GB capacity. The download process also warns you that you may need your Windows Product Key, which is often found on a sticker on the back of your laptop. I do have the original sticker with the Key on my laptop, but in practice I wasn’t asked for it. The download took about an hour, during which time you can still use your PC, and then the installation took about 2.5 hours. The first installation looked OK, but the Start menu and the taskbar didn’t respond to any mouse clicks at all. Luckily I had chosen to preserve my “files and apps”, so I could launch a browser from the desktop and start again.

You should also be aware that Windows 10 doesn’t like local users, and expects you to have a registered Microsoft account to log in to a Windows 10 PC.

Conclusions

Windows 10 is bright and shiny, and if you liked Windows 8 you’ll love it. It’s also inevitable, and eventually everyone will have it and get used to it – even with its dodgy use of terminology and its over-zealous tracking of your every move. I can’t see any real reason to rush ahead and upgrade for yourself rather than waiting your turn for an automatic upgrade, unless like me your idea of how to spend a “fun” few hours on a Sunday afternoon is a little bit warped!

You should however be aware of just how intrusive it’s likely to be, and if like me your answer to questions about how much personalised advertising you’d like to have is “as little as possible, thank you” you’ll have fun finding and switching off all of its default settings. We’ll have to wait and see how stable it is and how mandatory automatic updating works – a week after launch the first patch has already been released – but don’t worry about downloading it, you’ll get it automatically.