Meta’s AI translator can interpret unwritten languages

Roughly four out of ten of the approximately 7,000 known languages in the world exist without an accompanying written component. These unwritten languages pose a unique problem for modern machine learning translation systems, which typically need to convert spoken language into written text before translating it into the target language and converting that text back into speech – but one that Meta has reportedly addressed with its latest open-source AI advance.

The work is part of Meta’s Universal Speech Translator (UST) program, which aims to develop real-time speech-to-speech translation for easier interaction. As part of this project, Meta researchers studied Hokkien, an unwritten language spoken across the Asian diaspora and one of Taiwan’s official languages.

Machine learning translation systems typically require extensive labeled examples of the language, both written and spoken, to train on – exactly what unwritten languages like Hokkien don’t have. To get around this, “we used speech-to-unit translation (S2UT) to convert input speech directly into a sequence of acoustic units, in a path previously pioneered by Meta,” CEO Mark Zuckerberg explained in a blog post on Wednesday. “Then we generated waveforms from the units. Additionally, UnitY was adopted for a two-pass decoding mechanism, where the first-pass decoder generates text in a related language (Mandarin) and the second-pass decoder creates units.”
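For readers who want a more concrete picture of the pipeline Zuckerberg describes, here is a minimal, heavily simplified sketch in Python. The class, method names, and placeholder outputs are illustrative assumptions rather than Meta’s actual S2UT/UnitY code; the sketch only shows the flow of speech into an intermediate Mandarin text pass, then discrete acoustic units, then a waveform.

```python
# Minimal sketch of a two-pass speech-to-unit pipeline of the kind described
# above. The component methods and placeholder outputs are illustrative
# stand-ins, not Meta's actual UnitY/S2UT implementation.
from dataclasses import dataclass
from typing import List


@dataclass
class TwoPassS2UT:
    """Speech -> (intermediate Mandarin text) -> acoustic units -> waveform."""

    def encode(self, speech: List[float]) -> List[float]:
        # Stand-in "encoder": a real system would use a speech encoder
        # producing hidden states from the input audio.
        return speech

    def first_pass_text(self, hidden: List[float]) -> str:
        # First-pass decoder: emits text in a related *written* language
        # (Mandarin), used purely as an intermediate target.
        return "你好 世界"  # placeholder Mandarin text

    def second_pass_units(self, hidden: List[float], text: str) -> List[int]:
        # Second-pass decoder: conditions on the hidden states and the
        # first-pass text to produce discrete acoustic units.
        return [17, 4, 92, 4, 33]  # placeholder unit IDs

    def vocode(self, units: List[int]) -> List[float]:
        # Unit-to-waveform vocoder: turns discrete units back into audio samples.
        return [u / 100.0 for u in units]

    def translate(self, speech: List[float]) -> List[float]:
        hidden = self.encode(speech)
        text = self.first_pass_text(hidden)
        units = self.second_pass_units(hidden, text)
        return self.vocode(units)


if __name__ == "__main__":
    model = TwoPassS2UT()
    waveform = model.translate([0.01, -0.02, 0.03])
    print(f"Generated {len(waveform)} output samples")
```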

“We used Mandarin as an intermediate language to create pseudo-labels, where we first translated English (or Hokkien) speech to Mandarin text, then translated it to Hokkien (or English) and added it to the training data,” he continued. Currently, the system allows someone who speaks Hokkien to converse with someone who speaks English, albeit in a stilted manner. The model can only translate one full sentence at a time, but Zuckerberg is confident the technique will eventually be applied to more languages and will improve to the point of offering real-time translation.
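The pseudo-labeling step can likewise be sketched as a small loop: unlabeled speech in one language is pushed through existing models via Mandarin, and the resulting pairs are appended to the training set. The helper functions below are hypothetical stand-ins for whatever speech and text translation models are available, not Meta’s implementation.

```python
# Illustrative sketch of pivoting through Mandarin to create pseudo-labels.
# Both helper functions are hypothetical placeholders for real models.
from typing import List, Tuple


def english_speech_to_mandarin_text(audio: List[float]) -> str:
    # Placeholder: a model translating English audio into Mandarin text.
    return "你好"


def mandarin_text_to_hokkien_units(text: str) -> List[int]:
    # Placeholder: a model translating Mandarin text into Hokkien acoustic units.
    return [5, 12, 7]


def build_pseudo_labels(
    unlabeled_audio: List[List[float]],
) -> List[Tuple[List[float], List[int]]]:
    """Turn unlabeled English audio into (audio, Hokkien-unit) training pairs."""
    pairs = []
    for audio in unlabeled_audio:
        mandarin = english_speech_to_mandarin_text(audio)      # pivot language
        hokkien_units = mandarin_text_to_hokkien_units(mandarin)
        pairs.append((audio, hokkien_units))                   # add to training data
    return pairs


if __name__ == "__main__":
    pseudo = build_pseudo_labels([[0.1, 0.2], [0.0, -0.1]])
    print(f"Created {len(pseudo)} pseudo-labeled pairs")
```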

In addition to the models and training data that Meta is already open-sourcing from this project, the company is also releasing a unique speech-to-speech translation benchmarking system based on a Hokkien speech corpus called Taiwanese Across Taiwan, as well as “SpeechMatrix, a large corpus of speech-to-speech translations mined using Meta’s innovative data-mining technique called LASER,” Zuckerberg announced. This system will enable researchers to create their own speech-to-speech translation (S2ST) systems.
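LASER-style mining pairs utterances from two languages by embedding them into a shared space and keeping the closest matches. The toy sketch below uses random vectors in place of real LASER embeddings and a plain cosine-similarity threshold in place of the margin-based scoring used in practice; it illustrates the general idea, not SpeechMatrix’s actual mining code.

```python
# Toy illustration of similarity-based mining in a shared embedding space.
# Random vectors stand in for real LASER embeddings, and a simple cosine
# threshold stands in for margin-based scoring.
import numpy as np


def cosine_matrix(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    a_norm = a / np.linalg.norm(a, axis=1, keepdims=True)
    b_norm = b / np.linalg.norm(b, axis=1, keepdims=True)
    return a_norm @ b_norm.T


def mine_pairs(src_emb: np.ndarray, tgt_emb: np.ndarray, threshold: float = 0.8):
    """Return (src_index, tgt_index, score) for mutual best matches above threshold."""
    sims = cosine_matrix(src_emb, tgt_emb)
    pairs = []
    for i in range(sims.shape[0]):
        j = int(np.argmax(sims[i]))                     # best target for this source
        if int(np.argmax(sims[:, j])) == i and sims[i, j] >= threshold:
            pairs.append((i, j, float(sims[i, j])))     # keep mutual matches only
    return pairs


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    src = rng.normal(size=(5, 16))   # e.g. English utterance embeddings
    tgt = rng.normal(size=(6, 16))   # e.g. Hokkien utterance embeddings
    print(mine_pairs(src, tgt, threshold=0.1))
```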

