Which is better at translating from another language: a machine or a human? That was the question the Los Angeles-based firm Verbal Ink set out to answer when it embarked on a challenge to find out whether Google Translate could match the accuracy of a professional human translator. The firm pitted the search engine super-giant against Adriana (a real-life translator), and the results were more than surprising. Here are three of the key findings:
1. Google struggles with certain concepts
When people use Google Translate, they expect The Big G to provide a fast and accurate translation. Verbal Ink found that, for the most part, Google did exactly that. However, the service struggled to understand certain concepts, particularly those specific to a given language or dialect, and this sometimes altered the overall meaning of a text.
So, what did the research conclude? Although a human translator can work out to be more expensive than a free service like Google Translate, this study suggests that the former has a better understanding of the language used in an everyday context.
2. Google is great for basic language translations
Verbal Ink found that the service is great at conveying the basics of a text, although Adriana scored points when it came to overall interpretation and accuracy. In the study, professional translator Gaby V. found that, when the two versions were compared like-for-like, Google Translate churned out sentences that were “disjointed” in one example, with fractured syntax and poor grammar. Adriana, by contrast, had no difficulty with word choice or overall literal translation.
What have we learned? Google Translate is great for those who need a quick translation, but a professional translator might be more worthwhile if a complex document needs to be deciphered.
3. Google had difficulty with pronunciation
Verbal Ink’s research was based on two tests, the first of which involved translating a marketing pitch written in Spanish. The text was run through Google Translate and also given to Adriana to work her magic. Google was able to convey the overall meaning of the text in English, although some clauses and sentences were difficult to read. The second test involved speech: both Google and Adriana were asked to transcribe a speech delivered in Spanish before translating it into English. Here Google had difficulty with some pronunciations and repeated words.
Who won this round? Well, the human translator had a better grasp of pronunciation and clauses. To see the infographic and check out the audio and text files used for this experiment, click here.