Wednesday, August 6, 2014

My niece asked me what my tattoo meant and I told her it was Chinese for Julie. She Googled the translation of my name, and well, it didn't match my tattoo. I've had it for 15 years, and I'd like to know what it actually means. I'm almost afraid to know! Thank you.

Thursday, July 31, 2014

Wow, what a great blog; it has got me wondering about the 3 random letters, or what I thought were just letters, put on my shoulder. I do remember a Chinese person laughing at it in a nightclub once. Would it be possible for you to translate it?

I have had this tattoo on my upper back for about 10 years now. I was drunk when I got it. I want to say that the translation is supposed to mean "to honor father" or "memory of father", something in that ballpark. I'm not really sure what language or dialect it is. Could you help me?

Tian, I have had these 2 tattoos since I was 18 and have no idea what they mean. They were supposed to say "creative" and "crazy", if I remember correctly. If you can decipher them and let me know, that would be great.

Thanks,
Brian

Monday, June 23, 2014

from: John R.
to: tiangotlost@gmail.com
date: Sat, Jun 21, 2014 at 5:03 PM
subject: Tattoo Submission

Tian,

Been trying to get my brother to submit this for a while and he finally gave me the go-ahead. It is supposed to say "pure man", but one time someone told him it meant, more properly, something like "10/9ths Rice Man"... and that maybe that was a colloquial expression for purity. Sounds fishy! Can you help us out please?

Thank you!

THOSE passingly familiar with machine translation (MT) may well have reacted in the following ways at some point. “Great!” would be one such, on plugging something into the best-known public and free version, Google Translate, and watching the translation appear milliseconds later. “Wait a second…” might be the next, from those who know both languages. Google Translate, like all MT systems, can make mistakes, from the subtle to the hilarious.

The internet is filled (here for example) with signs badly machine translated from Chinese into English. What monolingual English-speakers don't realise is just how many funny mistakes get made in translating the other way. Take, for example, the Occupy Wall Street protester in 2011 who seems to have plugged “No more corruption” into a computer translator and made a sign with the resulting Chinese output. It read: “There is no corruption”.

MT is hard. It has occupied the minds of a lot of smart people for decades, which is why it is still known by a 1950s-style moniker rather than “computer translation”. Older models tended to try to break down the grammar or meaning of the source text and reconstruct it in the target language. This proved so difficult that, in retrospect, it is unsurprising the approach ran into intractable problems. But now, in an early application of “big data” (before the phrase was in vogue), MT systems typically work statistically. If you feed a lot of high-quality human-translated texts into a translation model in both target and source languages, the model can learn the likelihood that "X" in language A will be translated as "Y" in language B. (And how often, and in what contexts, "X" is more likely to be translated as "Z" instead.) The more data you feed in, the better the model's statistical guesses get. This is why Google (which has nothing if not lots of data) has got rather decent at MT.
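The statistical idea described above can be sketched in a few lines of code. Everything here is a toy illustration, not a real MT system: the "parallel corpus" is a hand-made list of word pairs, whereas real statistical systems learn from word alignments over millions of sentence pairs. But the counting principle is the same.

```python
# Toy sketch of statistical translation-probability estimation.
# The corpus below is invented for illustration only.
from collections import Counter, defaultdict

# Hypothetical aligned word pairs: (English source, German target)
parallel_corpus = [
    ("house", "Haus"),
    ("house", "Haus"),
    ("house", "Gebaeude"),
    ("free", "frei"),        # "free" as in "free to roam"
    ("free", "kostenlos"),   # "free" as in "free of charge"
    ("free", "kostenlos"),
]

# Count how often each source word is paired with each target word.
counts = defaultdict(Counter)
for src, tgt in parallel_corpus:
    counts[src][tgt] += 1

def translation_probs(src):
    """Relative-frequency estimate of P(target | source)."""
    total = sum(counts[src].values())
    return {tgt: n / total for tgt, n in counts[src].items()}

# In this toy corpus, "kostenlos" (free of charge) beats "frei" 2:1,
# so a purely statistical system would prefer it -- the same mechanism
# behind the "free of charge" tattoo mishap discussed further on.
print(translation_probs("free"))
```

The more pairs are fed in, the closer these relative frequencies approach the real-world distribution of translations, which is why data volume matters so much.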

If you "round-trip" the preceding paragraph in Google Translate, rendering it into German and then translating that output once again into English, the errors and infelicities multiply:

Machine translation is very good in the translation of single words, where all she has to do, is to act as an online dictionary. It is also good at common rates, as these chunks, which translates many times and so easily represented in the target language. It's not bad, simple sentences with a clear structure enough, though, once you start sentences plugging in, you'll start to see some sluggishness in the output. And all the lyrics begin, in fact, look very disjointed.

MT struggles in particular with surprising input that its training data has not taught it to expect. Hanzi Smatter, a blog, received a picture of a biker who got a computer-translated “Ride Hard Die Free” tattooed in huge Chinese characters down his torso. The only problem was that he got "die" in the sense of a “tool used for stamping or shaping metal” permanently inked on his body, probably because nothing like “die free” was in the translator’s training texts. (It also translated “free” as “free of charge”.) Perhaps lots of industrial or commercial materials were part of the training, explaining why the rather less common “tool” meaning of “die” was chosen over the more common “ring-down-the-curtain-and-join-the-choir-invisible” meaning.

To rely on raw MT output is almost as bad an idea as getting a full-body tattoo in a language you don’t speak. But it would also be a mistake to dismiss MT, a steadily improving tool that is best used with human post-editing. This week in Dublin, TAUS, an idea shop and resource-sharing platform for MT users, gathered originators and users of MT to talk about how to get users to share more of their data. The more everyone shares, the more everyone wins, but many companies consider their translation models proprietary assets.

The reason companies have proprietary systems is that MT’s quality is quickly improved by specific training for a restricted domain. For example, an industrial company would train its model to translate "die" with the “metal tool” meaning, a toy-maker would prefer the “cube with dots on each side” meaning, and a pet shop would prefer the “pushing-up-the-daisies” meaning. Such domain restriction increases the accuracy of translation quite a lot. It has the downside of making a single engine less useful for broader applications. But this problem is diminishing, since new such engines can increasingly be crafted quickly, as needed, for a given language pairing and domain (as long as enough training text is available, which is why TAUS is trying to get companies to share).
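The domain-restriction idea above can be sketched as a lookup against domain-specific phrase tables. The tables and glosses here are invented placeholders for illustration; a real engine would encode these preferences as learned statistics rather than hard-coded entries.

```python
# Minimal sketch: the same source word gets a different preferred
# translation depending on which domain's "phrase table" is loaded.
# All entries below are hypothetical, not output of any real engine.
DOMAIN_TABLES = {
    "industrial": {"die": "stamping tool"},
    "toys":       {"die": "gaming cube with dots"},
    "pets":       {"die": "to perish"},
}

def translate(word, domain):
    """Look the word up in the domain's table; fall back to the word itself."""
    table = DOMAIN_TABLES.get(domain, {})
    return table.get(word, word)

for domain in DOMAIN_TABLES:
    print(f"{domain}: die -> {translate('die', domain)}")
```

Restricting the table to one domain removes the ambiguity entirely, which is exactly why a narrow engine is more accurate within its niche and less useful outside it.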

This makes MT a lot more than a quick “good-enough” translator or an aid to tourists. Wayne Bourland of Dell, a computer-maker, says that using MT, plus post-editing, has cut translation time by 40% for his company, which localises its website in 28 languages. More importantly, MT saves money: it has saved Dell 40% of its translating cost since 2011. He calculates the return on Dell’s investment for MT at 900%—numbers, in other words, to die for.

So will MT replace human translators entirely at some point? Or perhaps even replace the need for learning foreign languages in the long run? That will be the subject of the next column.

The peculiar thing is that the characters are all correctly and properly written in a nice font, but it is completely gibberish in meaning. Could it possibly have been machine-translated into Japanese from English or some other language and the MT output tattooed as-is onto the unsuspecting tattooee’s foot? You would think that maybe the untranslated lowercase letter “i” in the supposedly Japanese MT output would have clued someone in to the fact that the MT had failed, leaving just gibberish.

It reads:

のiスタンドの多く No i-sutando no ooku

まだ私は独りで歩ける。 Mada watashi ha hitori de arukeru. (“I can still walk alone.”)

The first line makes no sense but might mean something along the lines of:

Monday, April 7, 2014

A long-time friend of mine was a tattoo-monster in her formative years and sure enough, some sort of Asian script made its way onto her body! I just HAVE to know if it says what she thinks it says! Help! :)