Futurism News

Google’s neural machine translation system has surprisingly developed its own internal language


For the past couple of months, Google has been showcasing its mission to improve translation using its latest machine learning techniques. Dubbed Neural Machine Translation, the system encodes an input sentence, translates it as a whole, and automatically outputs the translated sentence. It was initially introduced only for English and Chinese translations, but has since been extended to eight more languages in the official Translate app.

While teaching the neural network to translate particular language pairs, the Google Brain researchers working on the project grew curious about something unexpected. If the system was taught to translate between English and Japanese, and between English and Korean, would it then be able to translate Korean input sentences directly into Japanese? The researchers also wanted to find out whether the system would rely on English as a mediating language between the other two.

Well, to their surprise as well as ours, the neural network successfully translated Korean input sentences into Japanese output sentences. In the official blog post, the researchers noted that the system was able to “generate reasonable Korean⇄Japanese translations, even though it has never been taught to do so.” In other words, no English was employed in the translation process. Google believes this is the first time this type of transfer learning has worked in machine translation, and the team has named the process ‘zero-shot translation.’ In the GIF below, the zero-shot translation path is depicted by the orange trail.

[GIF: the zero-shot Korean⇄Japanese translation path, shown in orange]

This translation was made possible because, as Google explains it:

Our multilingual system, with the same size as a single GNMT system, shares its parameters to translate between these four different language pairs. This sharing enables the system to transfer the “translation knowledge” from one language pair to the others. This transfer learning and the need to translate between multiple languages forces the system to better use its modeling power.
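To make the idea concrete, Google's multilingual approach works by prepending an artificial token to each source sentence that tells the single shared model which language to produce. The sketch below illustrates that setup in plain Python; the `<2xx>` token format and the `make_example` helper are illustrative assumptions, not Google's actual code:

```python
# Sketch: one multilingual NMT model is trained on tagged sentence
# pairs; an artificial "<2xx>" token on the source side tells the
# shared network which target language to produce.
# (Token format and helper names are assumptions for illustration.)

def make_example(src_sentence, tgt_sentence, tgt_lang):
    """Prepend a target-language token to the source sentence."""
    return (f"<2{tgt_lang}> {src_sentence}", tgt_sentence)

# Training covers only English<->Japanese and English<->Korean...
training_data = [
    make_example("Hello", "こんにちは", "ja"),   # en -> ja
    make_example("こんにちは", "Hello", "en"),   # ja -> en
    make_example("Hello", "안녕하세요", "ko"),   # en -> ko
    make_example("안녕하세요", "Hello", "en"),   # ko -> en
]

# ...yet at inference time nothing stops us from requesting ko -> ja.
# Because all four directions share one set of parameters, the model
# can produce Japanese output even though no Korean->Japanese pair
# ever appeared in training. That is zero-shot translation.
zero_shot_input = "<2ja> 안녕하세요"
```

The key design point is that the target-language token is the only thing distinguishing language pairs, so "translation knowledge" learned for one pair is automatically available to every other.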

Though the translation between the two languages may not have been perfect, the success of this approach raised another big question among the researchers, and it is the one that may get you excited, surprised, and maybe a little scared all at once. The system was able to connect words and grasp concepts that it had never formally been taught. So does this mean the neural network has started forming a common representation, one in which words with the same meaning are interconnected on a deeper level, irrespective of language?

Simply put, the Google researchers began to question whether the translation system had developed its own language internally.

To find out, they created a three-dimensional visualization of the network's internal data, depicting translations across all of the aforementioned languages. Look closely and you will notice clusters of colored points. Within a single cluster, a sentence carries the same meaning in all three languages, not just the two the system was initially taught on. This strongly suggests that the translation system has started developing its own ‘interlingua’ to relate words and their meanings. (Woah!)

[Image: 3D visualization of the system's internal representations, with same-meaning sentences clustered across the three languages]

Since there is currently no reliable way to inspect the complex inner workings of a machine-learning-powered neural network, we cannot dissect the system and pin down exactly how it operates. We do not know whether the system is relating individual words or whole sentences to their corresponding meanings across languages, but the computer has clearly developed some such relation.
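The "interlingua" hypothesis can be stated simply: the encoder maps sentences with the same meaning to nearby points in one shared vector space, regardless of language. The sketch below illustrates that geometric idea with cosine similarity; the embedding vectors are toy values invented for illustration, not real GNMT encoder states:

```python
# Sketch: if the network has an "interlingua", same-meaning sentences
# in different languages should sit close together in its internal
# vector space. The vectors here are toy values for illustration,
# not real GNMT encoder outputs.
import math

def cosine(a, b):
    """Cosine similarity between two vectors (1.0 = same direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Hypothetical encoder outputs for "Hello" in three languages,
# plus an unrelated sentence for contrast.
emb = {
    "en: Hello":          [0.90, 0.10, 0.00],
    "ja: こんにちは":      [0.80, 0.20, 0.10],
    "ko: 안녕하세요":      [0.85, 0.15, 0.05],
    "en: The stock fell": [0.10, 0.20, 0.90],
}

# Same-meaning sentences cluster tightly across languages...
print(cosine(emb["en: Hello"], emb["ja: こんにちは"]))      # high
# ...while a sentence with a different meaning sits far away.
print(cosine(emb["en: Hello"], emb["en: The stock fell"]))  # low
```

This is essentially what Google's 3D visualization shows: clusters of points grouped by meaning rather than by language.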

This development could make it easier to teach the neural network additional languages. Earlier, Google needed to build and maintain many different systems in order to translate between any two languages, incurring significant computational cost. Now it could employ the smart ‘zero-shot translation’ capability and feed in a comparatively small set of inter-related samples to refine the system.

 

A hands-on guy fascinated by new apps, technologies and enterprise products.


