Translation Therapy
Exploring translation supporting technology, workflows, business practices, and a general outlet for various thoughts and ideas about practicing translation

After recently upgrading Windows 10 to the Creators Update (May 2017), I ran into an error message that was new to me. When starting Studio 2017, the following two ‘StudioUpdateClient’ messages popped up:

‘Failed to get archive directory listing’
and
‘Failed to load update data file’
The following video shows those messages at start-up:

The same happened when trying to manually check for updates (Help > Check for updates [in the Action ribbon group]) and when launching MultiTerm.
There was clearly something wrong with Studio’s update module.

I tried repairing Studio’s installation (SDL KB article number 000001414), as well as resetting Studio (SDL KB article number 000001417), but to no avail. I then decided to contact SDL, and with their help the problem was indeed isolated to a corrupt Update module.

There is plenty that can potentially go wrong when upgrading the operating system and/or a program (albeit it seldom does), and initially I suspected the Windows upgrade might have broken the Update module. However, other users who had also upgraded Windows didn’t experience this or any other issue, so I concluded the two were unrelated. Something, somewhere, went wrong, but there is not much to learn from it.

That said, the problem of not being able to update Studio remained, and I wanted to get it fixed.

How to repair Studio’s Update module and resolve the ‘StudioUpdateClient’ error messages

It is very easy to repair the Update module:

Close Studio and/or MultiTerm;

Navigate to ‘C:\ProgramData\SDL\SDL Trados Studio\Studio5\’ and delete the ‘Updates’ folder, or rename it to ‘Updates_old’;
Please note that the above path is for Studio 2017; the ‘Studio5’ part of the path should be replaced for earlier/future versions of Studio according to the relevant Studio version number, e.g. ‘Studio4’, ‘Studio6’, and so on.

Repair Studio’s installation by following the instructions in SDL KB article number 000001414.
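For those who prefer to script the folder backup, here is a minimal sketch in Python, assuming the default installation path for Studio 2017 (adjust the ‘Studio5’ part for other versions, as noted above):

```python
import os

# Minimal sketch: rename the possibly corrupt Updates folder before repairing.
# The path assumes a default Studio 2017 installation ("Studio5").
updates_dir = r"C:\ProgramData\SDL\SDL Trados Studio\Studio5\Updates"

if os.path.isdir(updates_dir):
    os.rename(updates_dir, updates_dir + "_old")  # keep a backup instead of deleting
    print("Updates folder renamed; now repair the installation per KB 000001414.")
else:
    print("Updates folder not found; check the version part of the path.")
```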

The ‘The Update application has been updated and must restart in order to continue’ loop

A corrupt update module can also result in the following behavior: when starting Studio or MultiTerm, or when manually checking for updates, the ‘The Update application has been updated and must restart in order to continue’ message appears, but confirming it seems to do nothing. Neither the update-available notification nor the ‘This application is up-to-date’ message is shown, and the same message pops up again the next time Studio or MultiTerm is launched. The same repair procedure described above should resolve this loop as well.


The release of a new major version of productivity software is always a mixed experience. On the one hand, new and sometimes very important and even exciting improvements and features are introduced; on the other hand, so are new bugs and quirks in the early stages, and major new features often need further refinement before becoming reliable.
Things get even more interesting when changes to the user experience are introduced, and the questions about their necessity, efficiency, and reception depend heavily on the implementation and the workflow they attempt to support.

memoQ 2014 Release 2 (abbreviated as memoQ 2014 R2) — the new major version of the Translation Environment Tool — continues memoQ’s evolution in terms of functionality and optimization, but also introduces an extensive overhaul of the user interface. So extensive, in fact, that it led Kevin Lossner, a trainer and an expert on all things memoQ, to argue that memoQ 2014 R2 should have been called memoQ 2015 to signify the departure from the traditional interface.

A quick overview of memoQ 2014 R2

memoQ 2014 R2 introduces several new features and improvements, but the change that gets the most attention is the departure from the conventional toolbar-centric interface in favor of the Ribbon interface*.

The advantages of the ribbon

I’m a fan of the ribbon interface and generally find it superior to conventional toolbars and menus. I always found the old toolbar-centric interface to be cluttered, distracting, and, above all, restrictive of natural workflow and the discovery of functionality. This is the area where I find the ribbon excels the most.

I see the following advantages in the ribbon interface:

A cleaner design that gets out of the way;

The ribbon promotes a logical and more focused workflow;

Discoverability: unlike the conventional approach, in which one first had to be familiar with the functionality in order to know where to find it, the ribbon design assists users in discovering functionality as they go along with their work;

Since its introduction in MS Office 2007, the ribbon interface has found its way into other components of the Windows operating system (as well as into other major third-party productivity tools), which effectively makes the ribbon a design language that can contribute to a more consistent user experience on Windows as a whole, something that in my opinion was historically lacking.

The implementation of the ribbon interface in memoQ

When it comes to user experience and workflow in general, it is not the idea that counts, but its implementation. Change for the sake of change is not a virtue, and even relatively little things can make or break a transition in user experience.
I was curious to learn about the rationale behind the transition to the ribbon interface, and hoped to gain some insight into its design principles. Kilgray was kind enough to help me with this by putting me in touch with Mónika Antunovics, memoQ architect, and she was kind enough to take the time and answer a few questions:

Q: Hello Mónika. Thank you for taking the time to answer a few questions about memoQ 2014 R2. Can you please introduce yourself?

Mónika: Before joining Kilgray two years ago, I spent fifteen years directly or indirectly involved in software localization and internationalization at Microsoft. At Kilgray, most of my time is taken up by designing the new features of memoQ.

Q: The most prominent feature of memoQ 2014 R2 is the transition to the ribbon interface. In the past, Kilgray stated that memoQ would never adopt the ribbon interface, and while I appreciate that people can change their perspective over time, can you please share some insight into the thought process that led from that statement, made about two years ago, to the eventual introduction of the ribbon interface in memoQ 2014 R2?

Mónika: This is a question that’s really hard to answer; looking back, I couldn’t really identify the tipping point. It’s certainly true that when I joined Kilgray, the opinion about the ribbon from everybody I ever talked to was “over my dead body”. I was surprised at this, since even by that time the ribbon was considered a success even by the most respected user experience professionals. What prompted us to action in the end was the fact that the command pane below the project list and document list was taking up ever increasing space, and it was eating badly into the area where we were displaying the most important information. We discussed this a lot and even looked at a possible alternative solution, which, after some iterations, started to look suspiciously like a ribbon, although a bit less usable. I think the decision was taken after realizing that the ribbon would offer the most in terms of space gain.

Q: Was keeping consistent with Microsoft’s design language a factor in the decision making?

Mónika: In my own case, I spent fifteen years there, that’s certainly an influence. Having been part of the Windows development team and participated in some product planning activities, I also had a closer insight into the tremendous amount of resources they have at their disposal for planning and usability studies. Needless to say, a small company like Kilgray can’t match that, so from the pragmatic point of view, why not use solutions Microsoft developed at their own cost and which have stood the test of time? There is also the principle of giving the user something she is familiar with, as it’s much easier to find the way around an interface that is built from well-known elements.

Q: Did the recent adoption of the ribbon interface by SDL in Studio 2014 and by Atril in Déjà Vu X3 play a role in the decision?

Mónika: I think the fact that our competitors are doing the same just shows that this is common sense – but if SDL and Atril moving in that direction influenced us in any way, I would say it was more delaying the inevitable ;-). Kilgray has never hidden its respect for its competitors such as SDL and Atril, but we do not copy other tools slavishly.

Q: In my humble opinion, the ribbon interface is more than just a cosmetic user interface (UI) design change; it is a change to the user experience (UX). This is probably the reason for some of the skepticism among experienced users concerned about possible disruption to their established workflow and ergonomics. From my experience, I estimate that the ribbon user experience will find its success and support, but how concerned was Kilgray about negative reactions from experienced users?

Mónika: There was some stage fright, but the first user tests reassured us that this was the right decision. Quite a few customers gave generously of their time and provided feedback on the ribbon. Their reaction was overwhelmingly positive; words like “intuitive” were mentioned more than once. Really negative feedback came about the fact that, with the old menus gone, some hotkeys inevitably changed, and people who relied on them were quite understandably upset; unfortunately this is something we can’t put right (on the other hand, shortcuts like Ctrl+Enter for committing a segment or Ctrl+F for Find all work as before). There were some suggestions about commands that could be more logically placed, although no two people said the same.

Q: I’m generally a fan of the ribbon and think Kilgray did a good job utilizing its advantages in memoQ 2014 R2. Can you please walk us through memoQ’s ribbon design principles?

The anatomy of memoQ 2014 R2 ribbon interface

Mónika: The order of the ribbon tabs mimics a typical workflow – first you create/manipulate a Project, then do some work importing Documents, followed by Preparation of said documents for translation. Then you hand off your project for Translation and Review – during these phases linguists Edit a lot. Finally, you can influence memoQ’s appearance from the View tab.

We also have two special tabs: the tab called memoQ opens up the application menu, i.e. a surface where you can access commands that influence the behavior of memoQ in general (this is where you can activate the product, set options, and access help, to name just a few). The other very special tab is called Quick Access – this one was designed with translators in mind, and gives them the commands from other ribbon tabs that they are most likely to want to use during translation.

We also have many so-called context tabs, which appear only in a given context, like editing a TM or extracting terminology.

Q: I noticed that the Settings menu and a few other panes kept their old design, which personally I find a little cluttered. Are there any plans to redesign them as well?

Mónika: We always have many more ideas than we have the resources to implement :). I personally would love to embed the settings fully in the application menu, but having seen that Options is a pop-up window even in Microsoft Word, I do wonder when we’ll finally get around to doing that.

Q: In my opinion, a good user experience must include some element of customizability. Unlike Microsoft Office’s ribbon, memoQ’s ribbon currently doesn’t offer any customization options, and I think that at the very least the Quick Access toolbar should be customizable. Are there any plans to make the Quick Access toolbar user-customizable?

Mónika: Definitely! In fact, we wanted to make the Quick Access toolbar customizable from the very start, but some technicalities I won’t go into prevented us doing this in time for the release. Stay tuned.

Q: Thank you for taking the time to answer these questions Mónika, it is much appreciated. Do you have anything you want to add?

Mónika: There is much more to memoQ 2014 R2 than the ribbon – we’ve revamped the TM editor, introduced a major usability improvement into segmentation rules, and translators can now even share TMs and TBs via Language Terminal with up to three people, which can be very useful for small, informal teams of translators. Go ahead, download, try and enjoy!

Summary

I have witnessed a few transitions to the ribbon user experience, and a reaction pattern has emerged: the announcement is followed by immediate skepticism, but as more users actually start using the ribbon, the reaction gets increasingly positive.
Kilgray did a good job implementing the ribbon and utilizing its inherent benefits, and therefore I’m convinced memoQ 2014 R2 will follow the same pattern. By now, I think the ribbon user experience has proven itself enough to generally become a non-issue, and even a preferred feature when implemented correctly.

It is reassuring to learn that the Quick Access toolbar will become customizable in the future, and while ideally I would prefer to be able to add an entirely new user-customized tab to the ribbon, the Quick Access toolbar is a good start.

In memoQ 2014 R2 the ribbon might get the most attention, but some important functionality foundations were laid for the future, and the usual further optimization of current features is present as well.
As always, this is a generally safe upgrade for those who like to be early adopters, but for those who prefer a more conservative approach it is advised to wait a few months before upgrading.

Footnote* A little terminology anecdote: the term ribbon was traditionally used to refer to what is now commonly known as the conventional toolbar-centric interface, whereas what we refer to as the ribbon is actually called by Microsoft the Fluent User Interface (FUI), which I find to be a much more appropriate description of it.


Studio 2014 SP2 was officially released on November 18, 2014. The official version includes some additional bug fixes compared to the Beta version I reviewed, but no new features were added or removed.

During the typical life-cycle of any major Studio version there are two types of updates:

Cumulative update: A scheduled release of hotfixes to address customer-impacting bugs, as well as critical performance and/or security issues.

Service Pack (SP): A collection of updates and fixes that improves the overall stability of the software, and usually introduces some new features or enhances the functionality of existing ones.

SDL was kind enough to give me access to the Beta version of the upcoming SDL Studio (and MultiTerm) 2014 Service Pack 2 (SP2), scheduled for release as a free upgrade for all current Studio 2014 users at the end of November 2014.
I have been testing the Beta version for about two weeks now, and decided to briefly go over the primary changes and give my general impression so far, for the benefit of those interested in knowing what is coming.

Enhanced Terminology Editing

The most noticeable change introduced by Studio 2014 SP2 is the long-awaited departure from Java. Together with the two new terminology editing commands, Quick Add New Term and Delete this Entry, this results in a major overhaul of the terminology management workflow and its performance.

The Removal of Java

Starting with Studio 2014 SP2, Studio and MultiTerm Desktop no longer depend on Java. The dependency on Java was arguably the main pain point for many users, and its removal should significantly improve the overall user experience. The new Java-less Studio, together with the Open eXchange Glossary Converter app, simplifies what was only recently a dreaded terminology management workflow almost beyond recognition.

The new Quick Add New Term option

In the traditional workflow, users had to click the Add New Term button, edit the terms in the Termbase Viewer window as appropriate, and then click the Save Entry button. Even without the slowness and instability issues of Java, this multistep process is a bit cumbersome and prone to human error (does forgetting to click the Save Entry button sound familiar to anyone?), especially when working with basic glossaries that contain only the source and target terms without any additional fields.
The new Quick Add New Term option in Studio 2014 SP2 greatly simplifies this workflow; adding terms to the active termbase on-the-fly has never been easier.

Note: Users who work with termbases containing additional fields (such as definition or context) and want to edit these fields before saving the term should continue using the traditional Add New Term workflow.

Deleting a Termbase entry from within the Editor window

The second enhancement to the terminology editing workflow is the addition of the Delete this Entry button to the Termbase Viewer window, for quickly deleting a Termbase entry without leaving Studio’s Editor window. A nice time saver.

Tag Verifier is now a global setting

Traditionally, the Tag Verification settings in Studio were tied to each supported filetype. Studio 2014 SP2 replaces the old filetype-specific settings with a new global tag verification setting (under File > Options > Verification > Tag Verifier).
Although it doesn’t seem like much at first glance, I think this is an important change that will minimize confusion and human error.

Sorting Translation Memory (TM) Results by date

Judging by the number of times I was asked if this is possible, this new feature of Studio 2014 SP2 should be greatly appreciated by many users.

The new Show most recent translation first option sorts TM results by match percentage > last edited date > last added date, thus giving the user finer control over how the TM and concordance search results are displayed.
This is the new default setting (accurate as of the time of this writing), but users who prefer the old behavior can easily switch back by going to Project Settings > Options > Language Pairs > [Relevant language pair] > Translation Memory and Automated Translation > Search and deselecting the first checkbox: Show most recent translation first.
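To illustrate the sort order in concrete terms, here is a minimal sketch (the record fields are my own invention, not SDL’s API):

```python
from datetime import datetime

# Hypothetical TM hits; the field names are illustrative only.
hits = [
    {"match": 100, "edited": datetime(2014, 3, 1), "added": datetime(2014, 1, 5)},
    {"match": 100, "edited": datetime(2014, 9, 2), "added": datetime(2014, 2, 1)},
    {"match": 95,  "edited": datetime(2014, 9, 9), "added": datetime(2014, 9, 9)},
]

# Match percentage first, then last-edited date, then last-added date,
# all descending: among equal matches, the most recent translation wins.
hits.sort(key=lambda h: (h["match"], h["edited"], h["added"]), reverse=True)
```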

Source segment editing for more supported filetypes

The source segment editing feature, first introduced in Studio 2011 SP2, can come in really handy at times. It is a simple way to clean up the odd messy source segment, correct a typo, and so forth without leaving the Editor window. It also became a way to work around Studio 2014’s segment merging limitations, serving as a crude ad-hoc mechanism for simple segment merging in poorly prepared documents.

However, source segment editing has some limitations:

Available only for Microsoft Office documents (Word, Excel, PowerPoint);

Starting in Studio 2014 SP2, source segment editing is available for most supported filetypes, but the other limitations remain unchanged.

Alphanumeric characters now added to the list of recognized tokens (placeholders)

This is quite significant. Studio 2014 SP2 now recognizes combinations of alphanumeric and the following characters as tokens (i.e. placeholders that are transferred directly to the target segments):

Uppercase letters (VGN-FW550F)

Numbers (M10X30)

Underscores (NAME_4001C)

Dashes (VGN-FW550F)

Full stops (BV0.mxm.072.531)

Activate this new option under File > Options > Language Pairs > Translation Memory and Automated Translation > Auto-substitutions, and then reindex all relevant existing TMs; otherwise the change will apply only to newly created TMs.

I tested this and it seems to work well (but don’t forget to reindex existing TMs).
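For a rough feel of the kind of pattern involved, the following regular expression is my own approximation of such a token matcher (not SDL’s actual rule set):

```python
import re

# Rough approximation of an alphanumeric-token matcher: uppercase letters and
# digits, optionally joined by underscores, dashes, or full stops, with at
# least one digit somewhere. This is not SDL's actual recognition logic.
TOKEN = re.compile(r"\b(?=[\w.\-]*\d)[A-Z0-9]+(?:[._\-][A-Za-z0-9]+)*\b")

for sample in ["VGN-FW550F", "M10X30", "NAME_4001C", "BV0.mxm.072.531"]:
    assert TOKEN.search(sample), sample
print("all samples recognized")
```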

Changes to the word count mechanism and search logic when handling words containing apostrophes, dashes and full stops

As SDL describes this change:

Studio 2014 SP2 uses an improved algorithm for processing words that contain dashes (-) or apostrophes (‘). This improvement translates into:

Lower word count. Studio no longer treats apostrophes and dashes as word separators, but as punctuation marks that link words together. This means that Studio counts elements like “it’s”, “splash-proof”, or “NAME_4001C” as one single word.
Apostrophes that do not follow the new logic:

Apostrophes followed or preceded by space. For example, “the ‘90s” or “girls’ night” both contain two words.

Right single quotation mark (’)

Dashes that do not follow the new logic:

Figure dash (‒)

En dash (–)

Em dash (—)

Horizontal bar (―)

Small em dash (﹘)

Higher fuzzy matches. When searching for matches in the selected TMs, Studio considers apostrophes and dashes as separate tokens inside the words they link together. This means that when comparing words where the only difference is the type of dash or apostrophe used, Studio only penalizes the difference in punctuation and not the entire word.
Important: Re-index your existing TMs before using them in Studio 2014 SP2. This synchronizes the TMs with the new logic for counting and matching words that contain apostrophes and dashes.
These options are available on the Translation Memory view > Home tab > Import > General Import Options page.
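To make the counting change concrete, here is a minimal illustration of the two behaviors (my own sketch, not SDL’s implementation):

```python
import re

text = "The splash-proof NAME_4001C isn't heavy"

# Old behavior: apostrophes and dashes act as word separators -> 7 "words".
old_words = re.findall(r"[A-Za-z0-9_]+", text)

# New behavior: apostrophes and dashes link words together -> 5 words.
new_words = re.findall(r"[A-Za-z0-9_]+(?:['\-][A-Za-z0-9_]+)*", text)

print(len(old_words), len(new_words))  # 7 5
```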

Personally, I don’t understand the logic behind this change. A cynic might say that the lower word count and higher match percentages could benefit some more than others, but I will reserve my judgment until I clarify this with SDL.

Some additional notable changes

A text replacement penalty is now applied to acronyms and alphanumeric placeholders. When activated, a replacement penalty can be assigned to acronyms and alphanumeric placeholders to indicate that they were transferred directly from the source segments (i.e. they match the token identification logic) and not taken from the active TM(s).

Support for the newer Adobe InDesign/InCopy CC file formats.

SDLXLIFF files are now always included in the return package.

Many more updates and bug fixes to existing features and core components that are transparent to the user but should improve stability.

Conclusion

The highlight of Studio 2014 SP2 is without a doubt the departure from Java, a major friction and frustration point throughout SDL Studio’s history. In my opinion, less dependency on external libraries is always better than being susceptible to changes made by a third party, even if the trade-off is losing some (peripheral) functionality.

The new terminology module should significantly improve the user experience, and in my experience so far it does just that. With the addition of the Quick Add New Term option, adding new terms on-the-fly is faster, simpler, and more reliable than ever before.

The other new features and improvements are generally a step forward, although their importance and impact differ depending on the workflow and use case. The only exception might be the change to the word count algorithm. I will attempt to get a clarification about it, but in the meantime just be aware of it.

So far Studio 2014 SP2 seems to be very stable. As can be expected from software still in Beta, there are also some bugs and quirks, but this is part of the Beta experience and I appreciate it. The most pressing bugs are likely to get fixed by the time Studio 2014 SP2 is officially released, but other bugs might be addressed only in future Cumulative Updates (CUs), which SDL tends to issue in quite a timely manner.

I haven’t encountered any "showstopping" bugs or issues even at this Beta stage, so for those who tend to be early adopters I can recommend with a degree of confidence that it is generally safe to upgrade to Studio 2014 SP2 when it is released. That said, I also acknowledge that what seems to one user a minor bug could be significant to another, and therefore I would recommend that those with a more conservative approach to upgrading wait a few weeks — probably until the release of the first CU — before upgrading to Studio 2014 SP2.


Readers of this blog, and those who might have stumbled upon some of my occasional posts on the subject in social media, know that translation supporting technology and workflows are areas I take a special interest in. I’m always grateful for opportunities to learn about new and different approaches and tools.

In recent years memoQ has gradually risen to become one of the major commercial TEnTs (Translation Environment Tools) on the market. Previously I had only very brief and superficial experience with older versions of memoQ, so when memoQ 2014 was released in June 2014, I was excited to take the opportunity and test it for a few months as my main production environment.

Being a relatively experienced user of Studio 2014, my original plan was to point out the similarities and differences in features and approach between the two TEnTs, but Emma Goldsmith had a similar idea and compiled an exhaustive side-by-side comparison of memoQ 2014 and SDL Trados 2014, doing a much better job than I ever could. Her comparison is compulsory reading for anyone interested in the similarities and differences between memoQ 2014 and SDL Studio 2014.

Instead, I decided to do a more general review of memoQ, starting with some of the major new features in memoQ 2014, continuing with some general features and approaches I like in memoQ, and concluding with my brief impression of memoQ 2014 after about three months of using it as my main production environment.

Note and Clarification

memoQ, SDL Studio, and most other TEnTs share the same basic concepts and workflow, which will not be covered in this post. I’m also not trying to determine whether memoQ is better or not compared to SDL Studio or any other TEnT, not least because I don’t believe in such a universal determination. The definition of ‘better’ always depends on personal preference, needs, and circumstances.

Migrating data to memoQ

Coming from SDL Studio, the first thing I had to do was migrate some of my translation memories and termbases. I have previously written about TEnT interoperability, and because both SDL Studio and memoQ support standardized file formats for data exchange, I knew this wasn’t going to be a problem. Note that metadata (such as customized fields in TMs or termbases) that is not part of the standardized format specifications might be lost in the process, but for me this wasn’t a big issue because I try to keep my data structure as simple and universal as possible.

Some of the New Features in memoQ 2014

memoQ 2014 is a major release that introduced many (according to Kilgray, more than 70) new features and improvements, so I thought I would start by going over those I consider most significant.

Project Templates

This is arguably the most significant addition in terms of workflow. Project templates automate the project creation process for recurring or running projects. There is no longer a need to reattach the same translation resources (TMs, termbases, LiveDocs corpora) and other light resources (i.e. project-related settings) when manually creating the same project time and again, nor to maintain container projects.
SDL Studio 2009 introduced a similar concept, and over the years I grew to rely on project templates more than I initially expected. To be honest, I would probably have felt restricted in memoQ without this newly added feature.

The new project template creation windows in memoQ 2014

Furthermore, the project template feature introduces a new setting – Automated Actions – that can be used to set up certain automated actions that take place during project creation, file importing, and wrap-up.
A project template with automated actions offers a great deal of flexibility, and it can be as minimal or as elaborate as needed.

The new Automated Actions options in memoQ 2014

Another new concept under the project template framework is the introduction of two separate translation memories: the Master and Working TMs. The Master TM is used for pre-translation and as a reference, while the Working TM stores all the confirmed segments in that specific project. Having two TMs can be useful, for example, for sharing only the project-specific TM without having to extract the relevant entries from a larger TM, or for using the Working TM as a buffer that can be edited and manipulated as needed without affecting and cluttering the Master TM. But if the experience of SDL Studio – which introduced a similar concept (Main and Project TMs) – is any indication, this separation can also create a great deal of confusion. It will be interesting to follow how this concept is received and used by memoQ users.

New Translation Tab

The Translation tab in memoQ 2014 has been revamped to better display the most important information. A new graphical progress bar was added, the filtering and sorting options were improved, and two new buttons, Structure and Details, were added. The Structure button switches to a hierarchical structure view of the project files – including any embedded objects and images (more on this shortly) – while the Details command brings up the details pane with a summary of project or file information.

The new Translation tab redesign in memoQ 2014. Note how the embedded Excel sheet and image show up as separate files

This is a nice redesign that presents the user with the most important information at a glance without digging for it.

New Tag Management and Handling

In addition to the traditional way of inserting tags, memoQ introduced the Tag insertion mode (keyboard shortcut F6) for inserting or removing tags in a segment. Sometimes there is a legitimate reason to remove tags from a segment, and doing this directly in the editor environment, without having to make the changes in the source document, makes life so much easier.

memoQ 2014 also introduced the Arrange Tags command that automatically arranges the tags in the target segment according to their order in the source segment. This is a nice and easy fix for those occasions in which the tag order gets out of hand.

Automatic Extraction of Embedded Objects, Image Localization, and the new Photoshop Text Filter

This is another significant workflow improvement. memoQ 2014 can directly extract and import any supported embedded file format, thus eliminating the need to extract those embedded objects manually and handle them separately. There is one limitation to keep in mind, though. Currently only objects that are one level deep in relation to the original document can be imported, but this shouldn’t be an issue in most cases.

memoQ 2014 uses a similar technique to handle embedded images. The images are extracted and prepared as a localization package that the user can review, transcribe, and add comments or instructions to; when the translation is done, it is automatically re-imported back into the project.
I usually don’t work with images, but I played around with this feature and it seems to work.
I usually don’t work with images, but I played around with this feature and it seems to work.

The new Photoshop text filter allows importing PSD files and translating their embedded text layers. I usually don’t work with PSD files, but any filter that directly imports content into the translation environment is a welcome addition.

Editing Time Report

I have long argued that a change is needed in the way translation fees are discussed. As independent professional service providers we sell our time and expertise, and therefore I argue that the most important metric to have is hourly earnings, and that hourly earnings should remain consistent across different workflows and work types. The key here is to accurately estimate how much time a project is likely to take, and use that to determine the project fee, which in turn can be converted to whatever base unit one is most comfortable working with.
Time measurement is also a key metric for evaluating how changes in the workflow actually affect productivity, and for verifying that a project progresses as expected.
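As a simple illustration of that conversion (the numbers here are made up):

```python
# Made-up numbers: derive a project fee and an equivalent per-word rate
# from a target hourly rate and an estimated project duration.
target_hourly_rate = 50.0   # desired earnings per hour
estimated_hours = 8.0       # how long the project is likely to take
word_count = 2500           # project volume

project_fee = target_hourly_rate * estimated_hours   # 400.0
per_word_rate = project_fee / word_count             # 0.16 per word
print(f"fee: {project_fee:.2f}, per-word rate: {per_word_rate:.3f}")
```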

For these reasons, I’m happy to see a time measurement mechanism being implemented in more TEnTs (in SDL Studio this functionality can be added through the Studio Time Tracker Open eXchange app). I might have done something wrong, but while testing this in memoQ the results didn’t seem very accurate, which is a little unfortunate. However, this is a newly introduced feature, and Kilgray has a proven track record of refining and improving features that were a little crude when first introduced.

Honorable Mentions

Better TaaS (Terminology as a Service) integration: TaaS is a cloud-based terminology platform. Personally I’m not too keen on cloud services, but others seem to like them, and I have heard that TaaS has great term extraction capabilities.

Existing term warning: Now a warning is displayed when trying to add a term that already exists in the Termbase.

Duplicates removal from TMs and Termbases: The TM and Termbase editors can now automatically remove duplicate entries. Any automation of this kind is always a welcome addition.

Rename projects and resources: Projects and other resources (TMs, TBs, etc.) can now be renamed. I would argue that a frequent use of this feature is probably an indication of an underlying inefficiency in the workflow that needs fixing, but it could come in very handy for the odd occasion when renaming is needed without breaking up the entire project.

Joint view of files with preview: The preview function now works with Views.

General Features I Like about memoQ (not specific to memoQ 2014)

The following is a summary of general features and approaches I have found useful and interesting in memoQ:

The LiveDocs corpus

I like to refer to a LiveDocs corpus as a huge contextual reference repository, and find it particularly useful for running concordance searches.

I don’t use the project management capabilities that most commercial TEnTs offer, so once I’m done with a project I usually remove it from the project list. In this scenario, sending the completed project to the LiveDocs corpus before removing it from the list serves as a form of archiving.

Document Versioning

A memoQ project can maintain different versions of the same document. This is quite a powerful feature for managing new versions of previously translated documents, especially when updating revisions of documents in highly regulated fields such as legal and pharmaceuticals. Using the X-translate feature, the translation from the previously translated version of the file is applied to the new version, bypassing the TM entirely, thus making sure that the new version will not be populated with statistically correct but contextually wrong segments.

However, versioning works only within a project (I think), and with the introduction of the project-template-based workflow, which should make container projects redundant, I’m not sure how this will work.

Views

A View in memoQ is a collection of segments from some or all project documents. In its most basic form it is used to virtually combine (or ‘glue’) several project files (similar to the Quick Merge feature in Studio 2014), but more complicated views can be created to split documents, extract repetitions, or create a view containing only segments with specific attributes, such as errors, status, comments, etc.

The Create View window in memoQ
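As a conceptual illustration only (plain Python, nothing to do with memoQ’s internals), a view is essentially a filtered, virtual collection over the project’s segments:

```python
# Conceptual sketch of a "view": a virtual collection of segments drawn from
# several documents, filtered by some attribute. All field names are made up.
segments = [
    {"doc": "manual.docx", "text": "Hello", "status": "confirmed", "comment": None},
    {"doc": "manual.docx", "text": "World", "status": "edited", "comment": "check"},
    {"doc": "flyer.docx",  "text": "Hi",    "status": "error",     "comment": None},
]

# "Glue" all documents together, keeping only commented or erroneous segments.
view = [s for s in segments if s["comment"] is not None or s["status"] == "error"]
print(view)
```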

Cascading Filters

The flexibility of running a second layer of filtering can save time and effort.

Export path rules, Placeholders, and Folder locations

memoQ gives the user control over folder and file locations, as well as naming conventions. The default settings work well – especially for the default workflow – but the ability to customize names and locations is greatly appreciated.

Self-contained

memoQ is (almost) independent of external third-party libraries. The immediate benefit is that there are only very few things that can break outside of Kilgray’s control, and in the long term I believe it contributes towards a consistent and uniform user experience.

Monolingual Review

This is one of the features I was really excited about. Before finalizing a project I like to review the translation in its native format. I never found the sentence/paragraph segmentation restrictive, yet reviewing the translation in its native format gives a different perspective that for me helps in ironing out minor kinks and putting the finishing touches. The challenge with this workflow is carrying these edits back to the editor environment without having to do them twice. As an acceptable workaround I usually use some kind of preview mode to review the translation, while making the edits in the editor environment. A little clunky, but generally works.
This is why I was really excited to try memoQ’s Monolingual Review feature that was first introduced in memoQ 2013 R2, and while not perfect, I’m happy to report that it works well and is a real time and energy saver.
Monolingual review can also be used for implementing edits by an external editor in the translated document.

My General Impression of memoQ

After a mere three months of use I’m nowhere near an expert memoQ user. I still have a lot to learn, and lacking some historical perspective on memoQ’s development from a user’s point of view, I might be wrong about some things or go about them the wrong way. If I am, please feel free to correct me in the comments.

I particularly like how the defaults work well out of the box, while memoQ still offers many customization options to accommodate different workflows.
I was also happy with the relatively small size of the software, and its performance.

I did find memoQ’s user interface to be a little cluttered, and the project folder structure obscure, but these are minor issues.

Overall, I have found memoQ to be a very capable, flexible, and robust TEnT; and I enjoy using it.


Can emotional states be manipulated just by the use of emotion-evoking communication? This was the question that Facebook set out to answer in a study that was published in the March 2014 issue of Proceedings of the National Academy of Sciences (PNAS) and stirred up quite a controversy. The paper, titled "Experimental evidence of massive-scale emotional contagion through social networks" (PDF file), describes a psychological experiment that Facebook conducted during one week in January 2012 among 689,003 of its users, to test how the amount of negative or positive content to which users are exposed affects their emotional state (i.e. mood) and their resulting behavior on Facebook.

In this blog article I don’t intend to discuss the controversy surrounding this research (here is the official reply of one of the researchers to the controversy). This is an important topic that deserves discussion, but it is outside the scope of this particular article.

Instead, I want to examine the results from a professional translation practitioner’s angle and suggest two lessons that I think we can learn from the study about the role of language in effective communication and social atmosphere setting.

The Goal of Facebook’s Research

Facebook’s research is based on the idea of Emotional Contagion, which can be defined as the ability to transfer emotional states to others, leading them to experience the same emotions as those around them without their awareness.
The occurrence of emotional contagion between individuals in in-person interactions is well established in psychology, and with this experiment Facebook attempted to learn whether emotional contagion can occur outside in-person interactions, i.e. whether textual communication alone – as opposed to verbal communication with its nonverbal cues – is enough to form emotional contagion.

The Experiment

To test this, during one week in January 2012 Facebook used a tweaked version of its algorithm to reduce the amount of "emotional content" in the News Feeds of the 689,003 users who were randomly (and unknowingly to them) selected to participate in the research. Emotional content was defined as a post containing at least one positive or negative word, or as explained in the research paper:

Posts were determined to be positive or negative if they contained at least one positive or negative word, as defined by Linguistic Inquiry and Word Count software (LIWC2007) word counting system, which correlates with self-reported and physiological measures of well-being, and has been used in prior research on emotional expression.

The posts were classified as positive or negative according to the above criterion, and then a certain percentage of them was omitted from the News Feed, so that users in the respective test groups were exposed to relatively less positive or less negative content.
The study design was quite simple, but because the general rates of positive (46.8%) and negative (22.4%) posts on Facebook differ, the researchers had to use two separate control groups – one for the Positivity Reduced group and a second for the Negativity Reduced group. The control condition was defined as the removal of the same proportion of posts from the News Feed as in the respective test group (Positivity or Negativity Reduced), only completely at random.
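A toy sketch of the classification criterion described above (the word lists are made up; the study used the LIWC2007 word counting system):

```python
# Toy illustration: a post counts as positive (or negative) if it contains at
# least one word from the respective list. These word lists are made up; the
# study used the LIWC2007 dictionary.
POSITIVE = {"happy", "great", "love", "wonderful"}
NEGATIVE = {"sad", "awful", "hate", "terrible"}

def classify(post: str) -> str:
    words = set(post.lower().split())
    if words & POSITIVE:
        return "positive"
    if words & NEGATIVE:
        return "negative"
    return "neutral"

print(classify("What a wonderful day"))  # positive
print(classify("Traffic was terrible"))  # negative
```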

The Results

Simply put, the study results suggest that exposure to positive content evokes positivity among readers, while exposure to negative content does just the opposite and evokes negativity. Or as the researchers explain in the paper:

When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred. These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks.

What can we Learn from Facebook’s Study?

As professional translation practitioners, on whom the power of semantics and the role of linguistic subtleties in effective communication are not lost, the study results come as no surprise. They just further reinforce the fundamental role that the effective use of language plays in human communication.

The Importance of Writing Skills

The first lesson is a bit of an extension (or even extrapolation) of the research results on my part, but I couldn’t help thinking about how often the importance of writing skills in translation is overlooked; how the emphasis and focus gradually shift from expertise and knowledge to rudimentary skills that should be considered prerequisites to the profession rather than qualifications, or even to superficial areas such as the workflow; and how misguided all of this really is.

At its core, every translation is an interpretation (or adaptation) of the original, and as such it involves rewriting. This is one reason why even two excellent translators are likely to create different versions of the same source document, and while both versions might be good, they will also be different – sometimes even to a point that makes one version more effective than the other in a specific context and use. It is the subtleties of the linguistic and cultural adaptation, such as the choice of tone, style, semantics, and overall rhetoric, that are likely to make or break effective communication.

Some in the translation space insist that translation is a technicality; a data, or big data, problem. For them language is just a data set that can be transposed into different data by transforming words and basic grammatical structures from one language to another. While it is unlikely that even those Language Transformation Mills, with their unskilled and/or HAMPsTr-centric workflows, will manage to turn a positive communication piece into a negative one (or vice versa), they can do even worse. By following this linear, one-dimensional, technocratic, and overly simplified approach, their transformation workflow can easily strip the communication of all the linguistic and cultural subtleties, otherwise known as the elements of effective human communication, thus evoking the emotion that every communicator fears the most: indifference.

The Importance of Balance

The second lesson that came to mind while reading the paper and thinking about Facebook’s research was the importance of balance and boundaries.

There is a lot of negativity, disinformation, bad advice, sponsored content, and a general atmosphere of gloom-and-doom in many translation communities and publications. Coincidentally or not, many of them are also the major gateways to the profession and main sources of information for many translators. Facebook’s study should remind us just how easily behavior can be manipulated by the type of content one is exposed to, and alert us to how this knowledge can be abused by people who stand to gain from such manipulation.
This is not to say that we should shy away or retreat from discussing controversial topics or from refuting misguided claims and notions about our profession; it means that we need to be constructive about it and do it in the interest of education. Knowledge, as opposed to information (which can be selectively used to ‘divide and conquer’), is one important key to building one’s professional identity and viewpoint, and to breaking out of the poverty and misery culture nurtured by so many.

In this day and age of information overload, it is important to remember Sturgeon’s law and filter out the noise and the time-wasters-and-energy-drainers. It is important to maintain balance and not let just about any time-waster-and-energy-drainer out there command the same time, attention, effort, and influence as true business partners with whom one shares professional and commercial values.

With all the (justified) controversy surrounding its inception and conduct, Facebook’s research gave us some valuable insights into, and a reminder of, the role and power of language in driving human emotions and behavior through communication and information (as opposed to knowledge) sharing.

Some Practical Advice

I want to conclude this article by sharing my recommendations for some knowledge resources and activities that can help in regaining focus. At times we all get discouraged (I know that I do) for some reason or other, but as Facebook’s research has demonstrated, breaking this cycle may be just one positive experience away.

Breaking the Gloom-and-Doom Self-Feeding Cycle

Phase out the ever-negative, FUD-driven fora and publications that seem to spread low quality information almost at random and nurture the poverty and misery culture. Replace them with sources that share knowledge;

Follow insightful people and/or those whose attitude you just like on Twitter and other social media;

Attend a local translation association meeting and/or international translation conference such as the Annual IAPTI International Conference; not least for the uplifting, feel-good, and recharging qualities that a couple of days of retreat have in general, and events that put the profession and its professional practitioners on a pedestal for a few days have in particular;

Connect with like-minded colleagues (at the conferences from the previous point, or in other fora) with whom you can develop professional relationships – and even friendships – over time. True colleagues are never the competition!

If you have any advice, resource, or general comment about the article that you would like to add or share, please let me know in the comments.


One of the major complaints of many SDL Studio users is the lack of a "simple" terminology management module. MultiTerm, Studio’s terminology management environment, is a full-fledged terminology management module, but not everyone needs all the functionality it offers, and some would prefer a more simple and straightforward workflow that better suits their more modest terminology management needs.

Furthermore, MultiTerm’s traditional workflow can be a little daunting at first. Creating, importing, or exporting a termbase (the file storing all the terminology information) is not as intuitive and straightforward as some would have preferred. And if that wasn’t enough, while Studio’s Editor environment enables the user to add and edit terms in a termbase that was added to a project, the termbase itself must first be created in MultiTerm, which is an external software package to SDL Studio. This adds another layer of confusion about how SDL Studio and MultiTerm interact with one another.

In this article I want to discuss how two Studio 2014 Open Exchange apps (for accuracy, one of them is actually a plugin) simplify the terminology preparation and exchange workflows, and why I think their significance to the Studio ecosystem is larger than the sum of their parts.

Enter the Glossary Converter Open Exchange app

The Glossary Converter Open Exchange app by Gerhard Kordmann (please note that version 3.1 of the Glossary Converter is currently a special edition to raise donations for the St Wilfrid’s Centre, Sheffield, and will become freeware again in October 2014) is a great little app that makes converting terminology information between commonly used file formats as easy as drag-and-drop.
It is so good that since its release, whenever I am approached with a technical question about handling terminology, my automatic reply is "download the Glossary Converter app". Whatever your terminology management trouble is, chances are that the Glossary Converter app can solve it.
Moreover, the Glossary Converter can convert bilingual Excel worksheets and CSV files into the Translation Memory eXchange (TMX) file format; or in other words, into a translation memory. Extremely useful.
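To give a rough idea of what such a conversion involves under the hood, here is a minimal sketch that turns a two-column CSV glossary into a TMX file (my own illustration; the file names and language codes are made up, and this is not the app’s actual code):

```python
import csv
import xml.etree.ElementTree as ET

# Minimal sketch: convert a two-column CSV (source, target) into TMX 1.4.
root = ET.Element("tmx", version="1.4")
ET.SubElement(root, "header", {
    "srclang": "en-US", "adminlang": "en-US", "datatype": "plaintext",
    "segtype": "sentence", "o-tmf": "csv",
    "creationtool": "sketch", "creationtoolversion": "0.1",
})
body = ET.SubElement(root, "body")

with open("glossary.csv", newline="", encoding="utf-8") as f:
    for source, target in csv.reader(f):
        tu = ET.SubElement(body, "tu")
        for lang, text in (("en-US", source), ("de-DE", target)):
            tuv = ET.SubElement(tu, "tuv", {"xml:lang": lang})
            ET.SubElement(tuv, "seg").text = text

ET.ElementTree(root).write("glossary.tmx", encoding="utf-8", xml_declaration=True)
```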

Because the purpose of the article is to discuss the "bigger picture", I will not go into details about how to use the Glossary Converter app.
To learn more about the Glossary Converter app and see it in action, watch Paul Filkin’s video below. In the video Paul demonstrates how to use the Glossary Converter app and covers all the information a user needs to know about it.

Still, one issue remains. The Glossary Converter is an external app and there is no way to interact with it directly from SDL Studio.

Enter the Glossary Plugin

Gerhard Kordmann, the developer of the Glossary Converter app, has also developed a plugin that adds that missing functionality and enables direct interaction with the Glossary Converter app from the Projects view of SDL Studio.
Paul Filkin has made another great video that demonstrates where to find the Glossary Plugin ribbon group in Studio and how to work with it, so take it from here, Paul:

While technically the Glossary Converter is still a separate software package, it is far less distracting than having to open MultiTerm itself, and its small size and intuitive interface make it very similar to any other typical software dialog window (project creation, settings, etc.) that users are accustomed to, and therefore it almost feels like an integral part of Studio.

Conclusion

While not a fully standalone replacement for MultiTerm, for most practical purposes the Glossary Converter and Glossary Plugin together form what could be considered the missing link between the traditional and "complicated" MultiTerm-centric workflow and a more simple, straightforward, and seamless workflow that – while still dependent on MultiTerm at its core – also makes MultiTerm almost transparent to the user.
In my opinion, this is what makes the Glossary Converter and Glossary Plugin tandem for Studio 2014 greater than the sum of its parts. Together they introduce a new, simplified, and seamless terminology creation and exchange workflow that did not exist before in the Studio universe, and from which – I would venture a guess – most individual Studio users can benefit greatly. Together, the Glossary Converter and the Glossary Plugin are must-have add-ons for any Studio 2014 user.

The only concerns I have are about fragmentation and the possible cessation of development. While there is currently no way to automatically update Open eXchange apps, the Glossary Converter app does check for updates on startup and notifies the user when a new version is available. This helps reduce fragmentation, i.e. different users running different versions, some possibly quite outdated. But much like manual backups and any other repetitive background process that requires the user’s attention and intervention, manual updates probably rank somewhere near the bottom of the user’s daily priority list and could easily get skipped. Unlike other processes and workflows in life that some insist on automating for no real reason, this is exactly the type of process that benefits from automation, and I would like to see some kind of automatic update system for the Open eXchange apps introduced in the future.

As far as cessation of development goes, this is always a risk, and for the reasons I’ve outlined above, I would like to see SDL acquiring the rights to these two add-ons in the future and implementing this functionality in Studio as a standard (and supported) feature.


Interoperability is a topic I have taken a special interest in since starting to use Translation Environment Tools (TEnTs). Most TEnTs store data in proprietary file formats, and that makes it that much harder to share or migrate information. One unfortunate result of this difficulty is the enablement of some unethical practices, and even more importantly, the creation of a feeling among users that they are held "captive" by the proprietary formats and forced to use a certain tool over another regardless of their workflow needs or preferences, unless they are willing to spend time and effort applying workarounds that are almost never guaranteed to work, or worse, invest money in tools just to use their filters in the pre-processing stage.

This resonates hard with me because I’m strongly against what I believe is a harmful, damaging, misleading, delusional, and near-sighted infatuation with technology that puts the technology before the human professional. I believe that the human professional is the most important element in any professional activity and that the technology is just there to help the professional as a tool. Therefore, professionals must be able to choose their tools by merit, experience, and expertise, with as few artificial obstacles as possible influencing the decision.

In recent years quite a few advancements have been made in terms of TEnT interoperability, probably promoted by the increased range of TEnTs available in the market and the emphasis that some developers have put on better standards support and interoperability from the get-go. Nowadays most modern Translation Environment Tools can exchange information via standardized file formats – primarily XLIFF (for bilingual files) and TMX (for exchanging Translation Memory information) – and some of them even offer native or extendable (via add-ons) support for reading and writing the proprietary formats of other TEnTs.

In that regard it is worth noting that, contrary to common belief, irritating as they sometimes are, proprietary file formats are not used just to restrict users; they allow developers to extend the functionality of the standard file formats and add features that users need, rely on, and come to expect. It is not the ideal situation, and there is still a long way to go in terms of improved (dare I say complete?) interoperability, but we have come a long way since even just five years ago.

For example, MemoQ can natively handle Studio SDLXLIFF and SDLPPX (Studio Package) files, as well as WordFast Professional TXML files; OmegaT can be extended through the Okapi Filters Plug-in to support additional file types; SDL Studio file support can be extended by installing additional File Type Definitions from the Open Exchange platform; and other TEnTs such as Fluency and Déjà Vu also offer some degree of interoperability, but I don't have enough experience with them to comment in detail. Since XML has become the de facto format for storing and exchanging information, modern TEnTs can also create customized XML file definitions to parse virtually any XML-based file, even when no native or extendable interoperability exists. And to complement this improved interoperability and extendability, information can always be exchanged via the standardized file formats.
Interoperability is not flawless, and exchanging information is still not always as smooth as it should be, but we have come a long way, indeed.
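To make the standardized-exchange point a bit more concrete, below is a minimal Python sketch (my own illustration, not part of any tool) that extracts source/target pairs from an XLIFF 1.2 file using only the standard library. The element names and namespace follow the XLIFF 1.2 specification; real-world files, SDLXLIFF included, layer proprietary extensions on top of this core, and the file name is hypothetical. The code:

# Minimal XLIFF 1.2 reader: lists source/target pairs using only the
# standard library. Real-world files (e.g. SDLXLIFF) extend this core
# structure with proprietary namespaces, so this is an illustration only.
import xml.etree.ElementTree as ET

NS = {"x": "urn:oasis:names:tc:xliff:document:1.2"}

def read_xliff(path):
    """Yield (source, target) text pairs from an XLIFF 1.2 file."""
    root = ET.parse(path).getroot()
    for unit in root.iterfind(".//x:trans-unit", NS):
        source = unit.find("x:source", NS)
        target = unit.find("x:target", NS)
        # itertext() flattens inline markup (g/x/bpt/ept) into plain text.
        yield ("".join(source.itertext()) if source is not None else "",
               "".join(target.itertext()) if target is not None else "")

for src, tgt in read_xliff("sample.xliff"):  # hypothetical file name
    print(f"{src!r} -> {tgt!r}")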

A couple of days ago I helped a colleague set up a WordFast project in Studio 2014 and thought I would share the experience as a short case study that highlights the process and the basic approach. The same process can be used to add support for MemoQ MQXLIFF files, as well as any other file type available through the SDL Open Exchange platform.

Installing New File Definitions in Studio

From my experience, at the time of this writing the most common proprietary TEnT file types that change hands in the market are SDL Studio's SDLXLIFF and SDLPPX (Studio Package), WordFast's TXML, and MemoQ's MQXLIFF. SDL Studio does not natively support the proprietary file formats of other TEnTs, but its support can be extended by installing additional file definitions:

Once installed, the new file type(s) should appear in Studio’s File Types list accessible through the File > Options menu;

If the newly or previously installed file types are not listed, check if the message Additional installed File Types exist is displayed on the right pane of the Options window. Sometimes Studio software updates visually reset the File Types list, but there is no need to panic: the installed file definitions are all still there;

Although installed, the WordFast TXML and MemoQ MQXLIFF file types are not displayed in Studio's File Types list. The Additional installed File Types exist message indicates that there are additional installed file types that are not showing on the list.

Click the Additional installed File Types exist message, select all the required file types, and click OK to confirm;

Selecting the additional installed file types

Now, all the additional file types should be displayed in the File Types List.

The WordFast TXML and MemoQ XLIFF file types are now displayed in the File Types list and available in the Open and New Project menus

In this WordFast case study the text for translation was provided as a WordFast Professional TXML file, and after installing the TXML file definition Studio was able to read and write it without any issues. Note that Studio still converts the TXML into an SDLXLIFF file, so when the work is done, make sure to save the file back in its original format by using the Save Target As… (Shift+F12) command or by finalizing the project through the Batch Tasks menu.

Migrating Translation Memories

Modern TEnTs support the Translation Memory eXchange (TMX) format for storing and exchanging Translation Memory information. TMX doesn't support all the extended metadata that some proprietary TM file formats do, so when exporting a TM as TMX some metadata, such as custom fields and attributes, might be lost. However, TMX files generally make sharing and migrating Translation Memory information easier, especially if the TM doesn't rely on proprietary metadata fields. Personally, I try to keep my TMs free of any critical proprietary metadata. I know that some use the Client and Project fields to create some internal structure and separation within a large TM, but I have always found this a bit restricting, both in terms of my workflow and in terms of "future-proofing" the content; even more so nowadays, when all modern TEnTs allow using more than one TM (a historic limitation of SDL Trados 2007 and earlier), so the user can add as many TMs as needed for reference. Another recommendation is to create an Exclusion file when importing a TMX file (this is the Studio term; in other tools this information might be available in another form). The Exclusion file stores all the TM segments that failed to import for one reason or another, and makes it easier to get a quick overview of what was left out, to identify possible problem patterns, and to determine whether the failed segments are even worth stressing over.
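As a rough illustration of what travels in a TMX file, and of the kind of tool-specific metadata that may get lost in a migration, here is a minimal Python sketch that lists each translation unit's language/segment pairs along with any prop metadata elements. It assumes a plain TMX 1.4 file with no default namespace, and the file name is hypothetical. The code:

# Minimal TMX reader: prints each translation unit's language/segment
# pairs plus any <prop> metadata, which is exactly the kind of
# tool-specific data that may not survive a migration.
import xml.etree.ElementTree as ET

XML_LANG = "{http://www.w3.org/XML/1998/namespace}lang"

def inspect_tmx(path):
    root = ET.parse(path).getroot()  # <tmx> with <header> and <body>
    for tu in root.iterfind("body/tu"):
        # Custom fields/attributes usually travel as <prop> elements.
        for prop in tu.iterfind("prop"):
            print(f"  [prop {prop.get('type')}] {prop.text}")
        for tuv in tu.iterfind("tuv"):
            seg = tuv.find("seg")
            text = "".join(seg.itertext()) if seg is not None else ""
            print(f"  {tuv.get(XML_LANG)}: {text}")
        print("-" * 40)

inspect_tmx("legacy_tm.tmx")  # hypothetical file name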

Legacy Formats

Although they will naturally become less prevalent with time, it is still important to understand how to handle legacy TM formats. New generations of TEnTs can import information that was created by their previous generation of technology; Studio, for example, can "upgrade" – i.e. convert – legacy SDL Trados TMs (in the TXT and MDW file formats) to its new SDLTM file format, but it cannot read TMs that were created by other TEnTs unless they are in the TMX file format. Because past generations didn't usually care much for open standards and interoperability, dealing with legacy formats can be a problem.

In this WordFast case study the TM was provided in the legacy TXT format of WordFast Classic. Although both WordFast Classic and SDL Trados 2007 and earlier can store TM content in a plain text file, the data is structured differently, and therefore Studio and WordFast Classic cannot read each other's legacy TMs.
To overcome this obstacle I used Olifant to convert the legacy WordFast Classic TM into a TMX file, and then imported the TMX file into Studio:

In Olifant, click File > Open; from the File Types dropdown menu at the bottom of the dialog select WordFast TM Files (.txt), and then select the TM you want to convert;

Once the TM is opened in Olifant, quickly go through it to make sure that there are no major character-encoding issues (a potential problem when dealing with plain text files, especially for non-Latin languages), and then use the File > Export command to save the content as a TMX file.

The WordFast legacy .txt TM opened in Olifant. For demonstration purposes the source and target segments are the same.
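Olifant did the job here, but for those who prefer scripting such a conversion, the sketch below shows the general shape: read a tab-delimited legacy TM and write a minimal TMX 1.4 file. Be warned that the column layout used here and the header-line convention are my assumptions about the WordFast Classic format, so verify them against a few lines of your own file; note also that the encoding parameter is exactly where the character-encoding pitfall mentioned above comes into play. The code:

# Sketch: convert a tab-delimited legacy TM to TMX 1.4. The column
# layout below is an ASSUMPTION about the WordFast Classic format
# (0 date, 1 user, 2 counter, 3 src lang, 4 src text, 5 tgt lang,
# 6 tgt text) -- verify it against your own file before trusting it.
import csv
import xml.etree.ElementTree as ET

XML_LANG = "{http://www.w3.org/XML/1998/namespace}lang"

def txt_to_tmx(txt_path, tmx_path, encoding="utf-16"):
    # Encoding varies: legacy WF TMs are often UTF-16 or a legacy codepage.
    tmx = ET.Element("tmx", {"version": "1.4"})
    ET.SubElement(tmx, "header", {
        "creationtool": "txt2tmx-sketch", "creationtoolversion": "0.1",
        "segtype": "sentence", "o-tmf": "WordFast", "adminlang": "en",
        "srclang": "*all*", "datatype": "plaintext"})
    body = ET.SubElement(tmx, "body")
    with open(txt_path, newline="", encoding=encoding) as f:
        for row in csv.reader(f, delimiter="\t"):
            if len(row) < 7 or row[0].startswith("%"):
                continue  # skip header/attribute lines and short rows
            tu = ET.SubElement(body, "tu")
            for lang_idx, text_idx in ((3, 4), (5, 6)):  # assumed layout
                tuv = ET.SubElement(tu, "tuv", {XML_LANG: row[lang_idx]})
                ET.SubElement(tuv, "seg").text = row[text_idx]
    ET.ElementTree(tmx).write(tmx_path, encoding="utf-8",
                              xml_declaration=True)

txt_to_tmx("wordfast_tm.txt", "converted.tmx")  # hypothetical file names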

In this WordFast case study only one out of approximately 3,000 segments failed to import. When consulting the Exclusion file I immediately noticed that this segment contained gibberish characters in the target segment, probably due to some encoding mishap somewhere along the life cycle of the original TM, and it was certainly not something worth stressing over trying to fix.

Now, with the new file definition installed and the legacy WordFast Classic TM safely imported into Studio, what seemed at first like a discouraging project that would take some doing and "tool-hopping" turned out to be quite simple to set up and complete entirely within Studio's Editor environment, which in this case was the TEnT of choice.

Terminology

Sharing or migrating terminology can be a bit more complicated, because the terminology management modules of the various TEnTs do not all support a standardized file format for exchanging terminology with all of its metadata (custom fields and structure). In terminology work, especially with more complex structured glossaries/termbases, the metadata can be a critical component, so the lack of full metadata support requires some more careful attention and planning before exporting and importing that information. With this limitation in mind, the safest and probably easiest way to share and migrate terminology information is via a delimited file type such as a Comma-Separated Values (CSV) file. For exchanging simply structured glossaries (i.e. a source term field and a target term field), sharing the information via a delimited file works perfectly.

In this WordFast case study there was no external glossary provided, but I used the opportunity to convert my colleague's own glossary, stored in a two-column Excel spreadsheet, into a MultiTerm termbase. Because it was stored as a two-column spreadsheet, the information could have easily been converted into a MultiTerm termbase using the MultiTerm Convert tool, but to simplify the process even more I opted to use the Open Exchange application Glossary Converter. The Glossary Converter automatically converts a terminology file (supporting XLS, XLSX, CSV, tab-delimited TXT, TBX, UTX and even TMX file types) into a MultiTerm termbase, and vice versa, simply by dragging and dropping the file onto the application window.

The main window of the Glossary Converter application
It cannot be any easier.
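For the curious, the underlying transformation is simple enough to sketch. The following Python snippet converts a two-column CSV glossary into a minimal TBX-style termbase file; it is a rough illustration of the structure only – real TBX has a formal header and dialect declarations I am glossing over, the language codes and file names are hypothetical, and the Glossary Converter of course does all of this, and much more, for you. The code:

# Sketch: turn a two-column CSV glossary (source term, target term)
# into a minimal TBX-style termbase file. Rough illustration of the
# structure only; the Glossary Converter handles this for you.
import csv
import xml.etree.ElementTree as ET

XML_LANG = "{http://www.w3.org/XML/1998/namespace}lang"

def csv_to_tbx(csv_path, tbx_path, src_lang="en", tgt_lang="de"):
    martif = ET.Element("martif", {"type": "TBX", XML_LANG: src_lang})
    header = ET.SubElement(martif, "martifHeader")
    desc = ET.SubElement(ET.SubElement(header, "fileDesc"), "sourceDesc")
    ET.SubElement(desc, "p").text = "Converted from " + csv_path
    body = ET.SubElement(ET.SubElement(martif, "text"), "body")
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.reader(f):
            if len(row) < 2:
                continue  # skip blank or malformed rows
            entry = ET.SubElement(body, "termEntry")
            for lang, term in ((src_lang, row[0]), (tgt_lang, row[1])):
                lang_set = ET.SubElement(entry, "langSet", {XML_LANG: lang})
                tig = ET.SubElement(lang_set, "tig")  # one term per language
                ET.SubElement(tig, "term").text = term
    ET.ElementTree(martif).write(tbx_path, encoding="utf-8",
                                 xml_declaration=True)

csv_to_tbx("glossary.csv", "glossary.tbx")  # hypothetical file names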

Conclusion

Interoperability has come a long way in recent years, but it still has a way to go. This improved interoperability is a very important development because it removes, or at least minimizes, some of the artificial obstacles, barriers, and limitations that users have faced and that, in part, contributed to the unhealthy focus on technology. Technology is just an enabler, not the destination.

In this article I offered a short overview of TEnT interoperability, as well as a short Studio-specific case study describing how a confusing and somewhat discouraging project, involving files created by different generations of another tool, turned into a straightforward Studio project. To end on a positive note, my colleague reported that everything went smoothly and flawlessly, and that the project was a complete success.


The release of Studio 2014 brought about some great new features and performance gains that, to me, finally make Studio a mature Translation Environment Tool. But not everything went completely smoothly, as soon afterwards along came the now infamous Java Update 45 that broke pretty much the entire terminology editing functionality in Studio and MultiTerm 2014. With the recent release of Cumulative Update 2 this incompatibility between Studio and MultiTerm 2014 and the latest Java version has been resolved, at least for the most part, but another nuisance that has loomed over Studio 2014 since its release remains unchanged: the controversial delay when trying to add or edit terms in a termbase.
This delay is quite the source of controversy among users, with some even claiming that it is a productivity showstopper that prevents them from switching to Studio 2014 or forces them to revert to a previous version of Studio. I sympathize with their sentiment and can understand how these short delays can add up over time, especially for those who do a lot of terminology editing.
In an attempt to better understand what causes this delay and to find out whether it can be avoided, I decided to investigate the issue further (within the scope of my limited knowledge on the subject), and came up with what I consider to be a satisfactory workaround for most users.

Java Security Checks

Before talking about any potential workaround or solution it is important to understand the cause of this delay in terminology editing in Studio and MultiTerm 2014.
With Studio and MultiTerm 2014, SDL has adopted and adapted to the security concept of Oracle (the company developing Java, the software platform that MultiTerm uses). Under the requirements of this security concept, components need to be digitally signed so that their validity can be verified. This is a basic and common concept in the computer security world, used to minimize security risks and mitigate vulnerabilities by preventing malicious developers from posing as trusted entities in an attempt to execute their malicious code on the systems of unsuspecting users. In other words, the validity of a software component must be verified before it is allowed to run, and this verification process is the cause of the terminology editing delay in Studio and MultiTerm 2014. Although this delay should be most noticeable on the first editing attempt in a session and much shorter afterwards, some users (me included) report a noticeable-enough delay in subsequent editing attempts as well.

It should be pointed out that this security check is not just a random, arbitrary step that slows everything down for no reason. The benefit of adhering to the requirements of Oracle's security concept is a more robust Java integration that should reduce the number of security prompts a user experiences and the number of workarounds that need to be applied – which for some users have historically been a serious security and/or productivity problem, and just as much of a nuisance as this delay – and overall improve the stability and reliability of Java operations within Studio.

Personally, I think that conforming to Oracle's security concept is a good thing for as long as Java is used. This relatively short delay in terminology editing is generally an acceptable price to pay in the interim for a more stable terminology working environment, but when doing a lot of terminology editing work these little delays do add up, and there is no denying that they impede productivity.

I suspected that the security check was the most likely cause of this delay, and because I do not know a whole lot about Java I set out to investigate the subject a little further in an attempt to learn how these checks work. Although I learned quite a lot in the process, I do not pretend to be an expert on Java, so the following explanation is just a quick and superficial overview of a quite broad subject.

As mentioned above, to improve security and stability the Java components that Studio uses are now digitally signed. This signature is called a certificate (think of it as an identity card) and can be set to expire after a certain amount of time, or on demand. Before any Java component is allowed to run on the system, the validity status (called revocation status) of its certificate is verified to make sure that the component is what it claims to be and that its certificate is still valid and has not been revoked for any number of reasons. If the component passes this verification process it is allowed to run; if it fails, a security prompt is displayed to the user and/or the component is allowed or disallowed to run, all depending on the system's Java security settings.

The Java platform uses two mechanisms for performing this check:

Certificate Revocation List (CRL): a list of certificates that were issued by an issuer and have since been revoked (i.e. invalidated). This is basically just a simple file that is downloaded and cached on the local system.

Online Certificate Status Protocol (OCSP): an internet protocol, and an alternative to the CRL method, used for obtaining the revocation status of a certificate by sending a request to an OCSP server, which returns the revocation status of that certificate. This mechanism is generally considered preferable to the CRL method because it always has the most up-to-date information, it is more resource-efficient, and it is generally considered more secure.
Recent Java versions have both of these methods enabled by default, and from my investigation and a little experiment on my system, the OCSP method seems to be the culprit causing the delay. However, I'm not sure exactly why, or where things could be optimized.

The Workaround(s)

In this article I’m suggesting two workarounds for disabling the OCSP security check, and in more extreme cases (that should be rare), both security checks; thus considerably shortening the delay. However, it is very important to note that the resulting performance gain is achieved by sacrificing some security, although.
That said, for the average Studio and MultiTerm 2014 user in a Small or Home Office environment, where Studio and MultiTerm are the only (or at least the major) Java-dependent software being used, and one that follows a common sense and some basic computer security best practices (such as keeping the Operating System and program up-to-date, not installing or opening files from unknown sources, avoiding questionable websites, etc.), the security compromise involved with this workaround is quite acceptable and well within tolerable margins, and for many users the performance gain probably outweighs this risk.

Disclaimer: Please note that this workaround is intended for users maintaining their own system and/or local IT infrastructure. It is NOT intended for users in an organization or other work environment in which the system and IT infrastructure are managed by a dedicated IT team; in such cases, please contact the person responsible for maintaining your system. It should also be noted that I don't take any responsibility for any issue that may arise from applying this workaround. Proceed at your own risk and understand that while I did successfully test this workaround on my system, there are far too many system-specific variables to account for, so your mileage may vary.

Preferred Workaround: Disabling the OCSP Check

This is the preferred workaround because for most average users the resulting security compromise is almost negligible.

Open Windows Control Panel;

Click the Programs icon and then the Java (32-bit) icon to launch the Java Control Panel;

Switch to the Advanced tab;

Scroll down to the Check for certificate revocation using settings group;

Select the Certificate Revocation Lists (CRLs) option;

Confirm to close all windows and exit the Java Control Panel;

Restart Studio and/or MultiTerm 2014, as appropriate.

The Advanced tab in the Java Control Panel showing the Check for certificate revocation using settings group
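Incidentally, the Java Control Panel persists these settings in a per-user deployment.properties file, so the same change can in principle be scripted. Both the file location and the key names in the following Python sketch are assumptions on my part based on Java 7 deployment documentation, so back up the file and verify the keys on your own system before relying on this. The code:

# Sketch: apply the CRL-only setting by editing Java's per-user
# deployment.properties file directly. The path and key names are
# ASSUMPTIONS based on Java 7 deployment documentation -- back the
# file up and verify on your own system first.
import os

PROPS = os.path.expandvars(r"%USERPROFILE%\AppData\LocalLow\Sun\Java"
                           r"\Deployment\deployment.properties")

def set_property(key, value):
    """Replace (or append) a single key=value line in the file."""
    lines = []
    if os.path.exists(PROPS):
        with open(PROPS, encoding="utf-8") as f:
            lines = [l for l in f.read().splitlines()
                     if not l.startswith(key + "=")]
    lines.append(key + "=" + value)
    with open(PROPS, "w", encoding="utf-8") as f:
        f.write("\n".join(lines) + "\n")

# Assumed Java 7 keys: keep the CRL check, skip the (slow) OCSP lookup.
set_property("deployment.security.validation.ocsp", "false")
set_property("deployment.security.validation.crl", "true")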

Secondary Workaround: Disabling Revocation Checks

If the preferred workaround described above did not work as expected, it is possible to disable both revocation checks altogether. Note that this workaround results in a bigger security compromise, but for the average user it is still arguably minimal and acceptable.

Open Windows Control Panel;

Click the Programs icon and then the Java (32-bit) icon to launch the Java Control Panel;

Switch to the Advanced tab;

Scroll down to the Perform certificate revocation checks on settings group;

Select the Do not check (not recommended) option;

Confirm to close all windows and exit the Java Control Panel;

Restart Studio and/or MultiTerm 2014, as appropriate.

The Advanced tab in the Java Control Panel showing the Perform Certificate Revocation Checks On settings group

Conclusion and a plea to SDL

SDL Trados and Studio have a long history of Java-related issues. Even though they usually stem from the Java side of the equation – which is not under SDL's direct control – SDL's responsibility lies in its decision to use Java. This latest terminology delay issue serves as an excellent example of why I have long thought Studio should move away from Java; while this issue is not a problem from a technical standpoint (there is nothing to fix, really; this is just how the security mechanism works), it very much is one from a User Experience perspective, and SDL would do right by addressing it from that angle instead of the more traditional and obvious technical one.
While most of the time these Java issues are eventually resolved by a patch or workaround, it is almost always a little too late as far as many users are concerned. I strongly urge SDL to drop Java altogether, which I think would enhance the User Experience, and potentially also the security and stability, in the process.


One of the most annoying and stressful problems to encounter after processing a document in a Translation Environment Tool is finding out that the target file cannot be created due to some obscure and vague error. This type of scenario is relatively common – it has actually just happened to me – so I thought that this would be a good opportunity to write a short article about the issue.

This error usually stems from missing tags, tags that were mishandled by the TEnT, or problems in the underlying structure of the source file.

The good news is that there are best practices for diagnosing these issues in a timely manner before the work starts, preventing them during the work, and solving them even after the work has been completed. There are no guarantees, because an almost infinite number of project-specific parameters, including the file type and the specific TEnT with its unique quirks, vary between projects, but in most use-case scenarios following these best practices will save a lot of time, frustration, and stress at the end of the project.

I’m using SDL Studio 2014 as my main TEnT and therefore will focus mostly on working with it, but these best practices should apply to all other TEnTs, with the necessary adjustments to account for each tool workflow and intrinsic “quirks”.

Best Practices for Making Sure That the Target Document Can Be Created at the End of a Project

Diagnosis

The first best practice is to try to save the target document right after it is opened in the TEnT. In SDL Studio this is invoked from the Editor view by pressing the Shift+F12 keyboard shortcut or via the File > Save Target As menu command. Any structural issues in the source file that prevent saving the target document in its original format will be discovered at this point, which is the best time in the project to learn about them, because there is enough headroom to perform all the necessary corrective actions and adjust the project timetable and/or fee as necessary.

If the target file was created successfully it means that the structure of the original file is intact, and any further error is most likely associated with missing inline tags, an issue that can be easily corrected.

Prevention

To avoid accidental deletion or omission of inline tags it is recommended not to work with a visual editor (one that displays the text in its final formatting, without the tags that control that formatting), and to set up the TEnT to display all tags instead.
In SDL Studio:

Click File > Options;

Select the Editor sub-menu from the left pane;

From the Formatting display style dropdown list, select Show all formatting and tags;

Click OK to confirm and exit.

It is important to note that the amount of tag information displayed for each tag in the Editor window can be controlled, as explained by Paul Filkin in his article Simple guide to working with Tags in Studio, which I recommend reading to better understand how to work with tags. Displaying the full tag text can come in handy when you need to understand what formatting a tag represents, while still keeping all tags visible.

Corrective Actions to solve an Error when Trying to Create the Target Document

If you followed the first best practice (trying to create the target document right after opening the file in the TEnT) and an error occurs, it means that there is an issue with the structure of the source file that must be resolved before the translation work starts. It is important to pay attention to the error details, because sometimes they can point to the source of the problem and allow a more targeted approach to fixing it.

The following corrective actions are presented in what I consider the recommended order for solving a general, unspecific target document creation error, but they can be used in any order.

Note: before starting to do anything, create a backup copy of the original file. If the error occurred at the end of translation, make sure to save the bilingual file and update or create a TM to use later with the new source file.

Open and Repair the Document

From my experience, most of the more persistent errors of this type seem to be associated with Word documents. Because these issues usually stem from a structural error in the source file, trying to repair the file using Word's Open and Repair command is a good starting point:

Open Microsoft Word;

Click File > Open and navigate to the source file;

Instead of clicking Open, click the down-arrow in the Open button and select the Open and Repair command.

Saving the Source File in Another File Format

All word processor file formats support basic and common text formatting such as bold, italics, underline, color, etc., but use different underlying structures for storing the data. Saving the original file in another file format – specifically one that supports fewer "complex" features – can remove some common underlying structures that are known to get broken or mishandled, while still maintaining all or most of the visual formatting. The recommended file format to use is RTF (Rich Text Format). It is generally considered safe to convert the original file into RTF without worrying too much about losing key formatting or functionality.

If the RTF conversion doesn’t work, I also suggest to try saving as DOC and DOCX file formats (in Microsoft Word and depending on the original file format), or as ODT if using another word processor that support this file format.

Now, try to import or open the converted file in the TEnT and create the target document.

Removing all Bookmarks from the Document

Word documents can contain bookmarks, such as Table of Contents entries and cross-references, that sometimes get broken or mishandled by the TEnT. SDL Studio is quite known for mishandling some bookmarks at times (seemingly a bit at random), so this clause is specifically relevant to Studio users.

The solution in this case is to remove all bookmarks from the original document and then process it again in the TEnT. If these bookmarks are important for the functionality of the file, they need to be added back after the target has been created.

There are two ways to remove the bookmarks from the document:

1. Manually

If there aren't many bookmarks in the document, and/or if after consulting the error message details you know which bookmark is likely responsible for the error, it might be more effective to remove the bookmarks one by one until the culprit is identified.

To display the bookmarks list:

Select the Insert Ribbon tab in Microsoft Word;

From the Links Ribbon group select the Bookmark command;

Select the Hidden bookmarks checkbox at the bottom of the window to display all bookmarks in the document;

Select the required bookmark and click the Delete button to remove it.

2. Using a Macro to Automatically Remove all Bookmarks from a Document
If the document has a lot of bookmarks it is not very practical to remove them manually. Fortunately, a Macro is available to remove all the bookmarks (including the hidden ones) at one fell swoop. The following Macro code is taken from Microsoft’s Support website.The Code:
Sub StripAllBookmarks()
    ' Remove every bookmark in the active document, including hidden ones.
    Dim stBookmark As Bookmark
    ' Make hidden bookmarks visible to the Bookmarks collection.
    ActiveDocument.Bookmarks.ShowHidden = True
    If ActiveDocument.Bookmarks.Count >= 1 Then
        ' Delete the bookmark markers; the bookmarked text itself is kept.
        For Each stBookmark In ActiveDocument.Bookmarks
            stBookmark.Delete
        Next stBookmark
    End If
End Sub

When working with DOCX files, the Enumeration already finished error is a special case of the general Failed to Save Target As error. This error is most likely the result of AutoText entries. The following instructions for deleting the AutoText fields are based on Microsoft's Delete an AutoText entry article (which refers to Word 2003), but the steps below apply to Office 2010 and 2013:

On the Insert tab, select the Quick Parts menu under the Text group, and then the Building Blocks Organizer… command.

Sort the entries by the Gallery column.

Click the name of the AutoText entry that you want to delete and then the Delete button.

Note: Deletion of an AutoText entry cannot be undone. The only way to restore an AutoText entry is to create it again manually. Therefore, proceed with caution.

Due to the possible effects that deleting AutoText entries could have on other or future documents, a safer approach in my opinion is to save the original document in the DOC format, translate it, and at the end save it back as DOCX; if the error occurred after the translation was already complete, save the file in the DOC format and re-translate it using the existing Translation Memory.

Running the tool’s built-in Tag Verification

If you forgot to test-save the target document before starting the work, only to find out that an error occurs at the end, or if the target document was created successfully at the beginning of the work but an error occurs now, the first action to take is running the TEnT's built-in Tag Verification function and correcting any tag-related issues that it identifies.

In SDL Studio the Tag Verification is invoked by pressing the F8 keyboard shortcut, or from the Review tab > Quality Assurance group > Verify command.

Note: some formatting tags, like font color, italics, bold, superscript, etc., are sometimes omitted by the translator on purpose. However, if a target saving error occurs, make sure to add back all the missing tags identified by the Tag Verification function, even those that were removed intentionally (just add them to the appropriate segment, but don't enclose any text within them).
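The tool's verifier is the authoritative check, but the underlying idea is simple enough to sketch: compare the inline-tag inventory of each source segment with that of its target. The following Python snippet illustrates the concept for a generic XLIFF 1.2 file; SDLXLIFF uses its own inline elements and the file name is hypothetical, so treat this as the concept rather than a drop-in replacement for Studio's verification. The code:

# Concept sketch of tag verification: for each trans-unit, compare
# the multiset of inline tags (by tag name and id) in the source with
# the target and report whatever the target is missing.
from collections import Counter
import xml.etree.ElementTree as ET

NS = {"x": "urn:oasis:names:tc:xliff:document:1.2"}

def tag_inventory(elem):
    # Count every descendant element, e.g. XLIFF's g/x/bpt/ept tags.
    return Counter((child.tag, child.get("id"))
                   for child in elem.iter() if child is not elem)

def verify_tags(path):
    for unit in ET.parse(path).getroot().iterfind(".//x:trans-unit", NS):
        source = unit.find("x:source", NS)
        target = unit.find("x:target", NS)
        if source is None or target is None:
            continue
        missing = tag_inventory(source) - tag_inventory(target)
        for (tag, tag_id), count in missing.items():
            print(f"Unit {unit.get('id')}: target is missing "
                  f"{count} x {tag} (id={tag_id})")

verify_tags("sample.xliff")  # hypothetical file name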

Conclusion

Not being able to create the translated document after processing it in a TEnT is quite stressful, because there is no clear corrective action to take while the deadline and/or the start of the next project are nearing.

In this article I have attempted to offer several best practices: diagnosing this error at the beginning of the project to allow for any resulting adjustments to the timetable and/or fee, preventing accidental deletion of tags during the translation process, and some recommended corrective actions.

Update: on November 27th, 2013 SDL released 'Cumulative Update 2', which addresses the Java incompatibility issue. For more information go to the Update November 27th, 2013: Cumulative Update Released section below. Update number 2: SDL has published a new Knowledgebase article (KB article No. 5060: Troubleshooting terminology issues in SDL MultiTerm and SDL Trados...

Update number 3: A source at SDL confirms that they do plan to move away from Java, and a new or revised terminology module will be released sometime later in 2014. I don't know yet whether it will be a full-fledged MultiTerm replacement (less likely) or a more basic terminology module (what most users need anyway), but that is good news.

Recently, Studio 2014 users were prompted to update their Java 7 installation from Update 25 (which comes installed with Studio 2014) to Update 45. Studio and MultiTerm have always had sporadic issues stemming from MultiTerm's reliance on Java, but to the best of my recollection this is the first time that a Java update has broken core functionality like this.

The workaround for solving this problem is quite simple: remove the new Update 45 and reinstall the old Update 25 until SDL releases an update that solves this issue once and for all.

The Workaround: Installing the old Java 7 update 25

First Method: Manual Installation

Open Windows Add/Remove Programs (Windows XP) or Programs and Features (Windows 7 and 8);

From the programs list locate Java 7 Update 45 and uninstall it (as well as any other instance of Java that appears there);

Download Java 7 Update 25 and install it; make sure to deselect the checkbox for installing the Ask Toolbar that comes up in the second window of the installation wizard (unless you really want to install the Ask Toolbar, and generally you don't).

Second Method: From within Studio 2014

From the program list locate Java 7 Update 45 and uninstall it (as well as any other instance of Java that appears there);

Restart the system (this is not mandatory but recommended);

Start Studio 2014, open or create a project, and add a termbase to it;

From the Editor window attempt to add a term to the termbase;

A dialog box will pop up asking if you want to install Java 7 Update 25 (make sure that it says Update 25);

Confirm and install the update. When you go through the installation wizard windows, make sure to deselect the checkbox for installing the Ask Toolbar that comes up in the second window (unless you really want to install the Ask Toolbar, and generally you don't).

Lowering Java Security Settings

If the error An error has occurred in the script on this page, or a similar message that refers to an error in a script, occurs when trying to add a term to the termbase even after rolling back to Java Update 25, lowering the Java security settings usually solves it:

Open Windows Control Panel > Programs;

Click the Java (32-bit) icon to launch the Java Control Panel;

Switch to the Security tab;

Lower the security setting bar to the High or Medium setting, until the error no longer appears and MultiTerm works as expected.

Click OK to confirm and exit.

Java Security settings tab

Finishing up: Disabling Java Automatic Updates (not mandatory)

To prevent automatic or accidental Java updates it is best to disable Java's automatic updates until this issue is officially resolved.
This is not a mandatory step, just a recommendation.

Open Windows Control Panel > Programs;

Click the Java (32-bit) icon to launch the Java Control Panel;

Switch to the Update tab;

Deselect the Check for Updates Automatically checkbox;

Click OK to confirm and exit.

Once the issue is officially resolved it is recommended to turn on automatic updates again by following the above steps, only now selecting the Check for Updates Automatically checkbox instead of deselecting it.

I will update this short article with the relevant information when the official patch or update is released.

Earlier today SDL released 'Cumulative Update 2', which brings Studio 2014 to version 11.0.3688.0 and, among other things, solves the incompatibility issue between MultiTerm and Java 7 Update 45. I've updated my versions of Studio and Java to test it, and it does seem to solve the problem.

Downloading and Installing Cumulative Update 2 (or above, if you stumble upon this article at a later time)

If Studio’s Automatic Updates function is enabled but you were not prompted to install this update upon starting Studio, proceed to install the update manually: Switch to the Help tab on the Ribbon, and from the Actions tab group select the Check for Updates command. Follow the on-screen update wizard to complete the update.
This Cumulative Update is also available as a standalone file from the Download section of SDL’s Hotfixes and cumulative patches for SDL Trados Studio 2014 Knowledgebase article linked above, but unless there is a reason to do otherwise, it is recommended to update Studio using its built-in updater mechanism.

To update Java, go to the Java download page and download and install the latest version.
Alternatively, update Java from the Java Control Panel:

Open Windows Control Panel > Programs;

Click the Java (32-bit) icon to launch the Java Control Panel;

Switch to the Update tab;

Click the Update Now button (bottom right);

If you previously disabled Java Automatic Updates to avoid accidental installation of Update 45, it is generally a good idea to turn Automatic Updates back on by selecting the Check for Updates Automatically checkbox;

If you had to lower your security settings to avoid Java script errors, it is also recommended to switch to the Java control panel Security tab and set the security level to High (if the errors persist, lower it back to Medium);

Click OK to confirm and exit.

Unfortunately, at this point there is still no official update that solves this issue for Studio 2011. There is an unofficial workaround for the MultiTerm 2011 incompatibility with Java Update 45, and although the risk associated with installing it is probably minimal (and in the worst-case scenario it could be uninstalled), I would recommend waiting a while longer, as long as using an older Java version is not a considerable problem on the specific machine running Studio 2011.