What is Machine Translation?

Authors
  • Antonio Castaldo

I often find it difficult to explain to non-experts where my research field stands, at the intersection of language and computer science. This should come as no surprise, considering we have always been taught that language is an inherently human ability.

As language and technology continue to intertwine, we are increasingly aware of how powerful and helpful machines could be if given the ability to use natural language. In this blog, I will share my knowledge and code so that other interested people may benefit from my passion for natural language processing and artificial intelligence.

We have come a long way since Georges Artsrouni, a Georgian-born engineer, developed his first prototype of what he called a "mechanical brain" in 1932 [1]. Artsrouni's mechanical brain was a multilingual dictionary-based machine, capable of providing raw word-by-word translations.
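
To make the idea concrete, here is a minimal Python sketch of what dictionary-based, word-by-word translation amounts to. The tiny French-English lexicon is invented purely for illustration, and Artsrouni's device was of course mechanical rather than programmable:

```python
# A toy French-to-English lexicon, invented purely for illustration.
lexicon = {
    "le": "the",
    "chat": "cat",
    "est": "is",
    "sur": "on",
    "la": "the",
    "table": "table",
}

def translate_word_by_word(sentence: str) -> str:
    """Replace each word with its dictionary entry; unknown words pass through unchanged."""
    return " ".join(lexicon.get(word, word) for word in sentence.lower().split())

print(translate_word_by_word("Le chat est sur la table"))
# -> the cat is on the table
```

Even on this toy example the limits are clear: substituting one word at a time ignores word order, agreement, and ambiguity, which is why such systems could only ever produce raw translations.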

Since Artsrouni's invention, we have acquired the ability to translate from and into 200 languages, and we are increasingly capable of translating creativity, figures of speech, and figurative language. Of course, none of this would have been possible without the discoveries in the field of neural networks, by incredible scientists and researchers such as Frank Rosenblatt with his Perceptron in 1958, Geoffrey Hinton with his work on backpropagation algorithms [2] and, more recently, the researchers at Google behind the world-changing work on Transformers by Vaswani et al. (2017) [3].

This blog does not aim to be a resource for students or machine translation scholars, but rather a personal space where I can share random stories, interesting insights, and tutorials related to the field of machine translation.

Here are some resources to learn more about these topics:

  1. A Comprehensive Guide to the Backpropagation Algorithm in Neural Networks
  2. IBM's Introduction to Gradient Descent
  3. Quick Introduction to Weights (Gradients)

Footnotes

  1. Hutchins, J. (2002). Two precursors of machine translation: Artsrouni and Trojanskij.

  2. Rumelhart, D.E., Hinton, G.E., & Williams, R.J. (1986). Learning representations by back-propagating errors. Nature, 323, 533-536.

  3. Vaswani, A., Shazeer, N., Parmar, N., Uszkoreit, J., Jones, L., Gomez, A. N., Kaiser, Ł., & Polosukhin, I. (2017). Attention is all you need. Advances in Neural Information Processing Systems, 30.