By Daniela Hernandez
Getting Amazon.com Inc.'s Alexa to work across the globe and in many languages requires that the company's engineers keep the "local experience" in mind, according to Toni Reid, Amazon's vice president for Alexa Experience and Echo.
The voice assistant has more than 100 million customers world-wide, she said Monday at The Wall Street Journal's Future of Everything Festival.
Developers have to understand differences in the way people speak: English in Australia is different from English in India, the U.S. or the U.K. They also have to keep users' accents in mind so that Alexa doesn't misinterpret or misunderstand commands. To catch such mistakes, Amazon sometimes has employees listen to select Alexa recordings to improve its software. Recently, the tech giant came under fire after reports surfaced that employees were listening to recordings from customers' Alexa devices.
The company stressed that customer trust and transparency were of the utmost importance.
The Alexa voice assistant is now available in six languages in more than a dozen countries. Scaling up the number of languages voice assistants speak can be difficult because of a lack of data: the artificial-intelligence systems that power most modern voice assistants require large amounts of data to work well. For languages like English and Spanish, there is a lot of data on the internet on which to train AI.
To circumvent the data-scarcity problem, AI developers are leveraging techniques such as transfer learning, according to Rohit Prasad, head scientist for Alexa AI at Amazon. Transfer learning is akin to learning from experience in humans, he said. Similarly, software can take insights learned from a data-rich task and apply them to a new problem where data isn't available in large quantities.
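The idea Mr. Prasad describes can be sketched in a few lines of code. The toy example below (not Amazon's actual system; all names and the synthetic data are illustrative) pretrains a simple model on a "data-rich" source task, then freezes what it learned and fits only a tiny new component on a "data-scarce" target task with just 30 labeled examples:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    # Clip to avoid overflow warnings for large |z|.
    return 1.0 / (1.0 + np.exp(-np.clip(z, -60, 60)))

def fit_logistic(X, y, epochs=500, lr=0.5):
    """Plain gradient-descent logistic regression (no bias term)."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        p = sigmoid(X @ w)
        w -= lr * X.T @ (p - y) / len(y)
    return w

# "Data-rich" source task: 2,000 labeled examples.
X_src = rng.normal(size=(2000, 10))
y_src = (X_src[:, 0] + X_src[:, 1] > 0).astype(float)

# Pretrain on the source task. Its learned weight vector acts as a
# one-dimensional feature extractor -- a toy stand-in for the hidden
# layers of a pretrained network.
w_src = fit_logistic(X_src, y_src)

# Data-scarce target task: only 30 labeled examples of the same
# underlying rule -- too few to learn all 10 weights reliably.
X_tgt = rng.normal(size=(30, 10))
y_tgt = (X_tgt[:, 0] + X_tgt[:, 1] > 0).astype(float)

# Transfer: freeze the extractor, fit only a tiny "head" (one weight)
# on the extracted feature.
f_tgt = (X_tgt @ w_src)[:, None]
w_head = fit_logistic(f_tgt, y_tgt)

# Evaluate on held-out target-task data.
X_test = rng.normal(size=(500, 10))
y_test = (X_test[:, 0] + X_test[:, 1] > 0).astype(float)
preds = sigmoid((X_test @ w_src)[:, None] @ w_head) > 0.5
accuracy = float((preds.ravel() == y_test).mean())
print(f"target accuracy with transferred features: {accuracy:.2f}")
```

Because the pretrained extractor already encodes the relevant structure, the head needs far less target data than training from scratch would, which is the point of the technique.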
Write to Daniela Hernandez at email@example.com