Omission and other sins: Tracking the quality of online machine translation output over four years

Susan Lotz, Alta Van Rensburg

Abstract


Online machine translation (MT) has empowered ordinary language users to have texts translated entirely on their own. But are these users aware of the pitfalls? This article draws on a longitudinal study that explored the quality of output by the online MT application Google Translate in the language combination Afrikaans–English. We investigated the distribution of errors in two sets of translations (slide-show text and news report text) that we had Google Translate produce annually over a period of four years, 2010–2013. Omission, Mistranslation, Non-translation and Grammar were the error categories that scored highest in the analyses. In addition, we found that although the quality of the translations seemed to improve up to 2012, the pattern of improvement levelled off, with some of the 2013 output containing more errors than that of the previous year. We believe users should be made aware of the risks they unknowingly take when using online MT.


Keywords


error categories, Google Translate, machine translation, mistranslation, non-translation, translation quality


DOI: http://dx.doi.org/10.5774/46-0-223





ISSN 2223-9936 (online); ISSN 1027-3417 (print)

This work is licensed under a Creative Commons Attribution 4.0 International License.


Powered by OJS and hosted by Stellenbosch University Library and Information Service since 2011.


Disclaimer:

This journal is hosted by the SU LIS at the request of the journal owner/editor. The SU LIS takes no responsibility for the content published within this journal and disclaims all liability arising out of the use of, or inability to use, the information contained herein. We assume no responsibility for, and shall not be liable for, any breaches of agreement with other publishers/hosts.
