AI

Alexa’s next trick? Figuring stuff out itself

Amazon is expanding the scope of self-learning features for its voice assistant.

In 2020, Amazon encouraged users to teach Alexa how to interpret certain requests.

Now, like a Montessori teacher, Amazon wants the AI system to figure some things out on its own.

This approach, which Amazon dubbed “self-learning,” is all about decreasing friction and making human-AI interaction feel more natural. In the last year, the company has begun adding a suite of such features to consumer devices worldwide.

Self-learning boils down to advancing Alexa’s capacity for common-sense reasoning, which is one of AI’s most notorious obstacles. More widely known as “self-supervised learning,” it’s essentially a form of AI learning that’s, well…not supervised. Instead of learning from human-labeled data, the AI analyzes data on its own to draw conclusions.
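To make that concrete, here's a minimal, hypothetical Python sketch (not Amazon's code) of the core idea behind self-supervised learning: the training labels come from the raw data itself, in this case by hiding a word in an unlabeled sentence and asking the model to predict it.

```python
# Minimal sketch: self-supervised learning builds its own labels from raw,
# unlabeled text. Masking a word turns each sentence into an (input, target)
# training pair with no human annotation required.
import random

sentences = [
    "play some chill music in the living room",
    "turn off the lamp in the bedroom",
]

def make_training_pairs(corpus, mask_token="[MASK]"):
    pairs = []
    for sentence in corpus:
        words = sentence.split()
        masked_index = random.randrange(len(words))
        target = words[masked_index]        # the "label" comes from the data itself
        words[masked_index] = mask_token
        pairs.append((" ".join(words), target))
    return pairs

for masked_input, target in make_training_pairs(sentences):
    print(f"input: {masked_input!r}  ->  predict: {target!r}")
```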

Self-supervised learning is one reason why natural language processing—a subset of AI that underpins tools from Google Search to auto-complete—has advanced so much in recent years. Meta has used it to train image-recognition models; Google has used it for medical-image classification; and Amazon has used it for automated speech recognition.

“Instead of a bunch of scientists taking data and trying to update the models and then roll them out, self-learning is something that works out in the field—there’s no human intervention, no human involvement,” Prem Natarajan, Alexa AI’s VP of natural understanding, told us. “It’s directly learning through its interactions with customers, whether it’s implicit customer behavior or otherwise.”

For its part, Alexa's self-supervised learning still uses deep-learning architectures and pretrained language models as a foundation, but the team wants to use data from user interactions to fill in some of Alexa's common-sense gaps.

What that looks like: Let’s say you’re trying to impress your friend who has great taste in music. Before they arrive, you tell Alexa to play “chill” music, expecting it to queue up artists like Japanese Breakfast and Mitski. But the device responds that it doesn’t understand your request. So you switch gears and tell it to start Spotify’s “Indie Pop” playlist. From that friction, Alexa can infer that “chill” and this “Indie Pop” playlist are in some way related—and potentially use that information for future user requests.
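A toy sketch of that feedback loop might look like the snippet below. The function names and data structure are our own illustration, not anything from Alexa's systems: the idea is simply to log a failed phrase next to the successful follow-up and reuse the link the next time the phrase comes up.

```python
# Hypothetical sketch of the "rephrase" signal: if a request fails and the
# customer immediately succeeds with a different phrasing, link the failed
# phrase to the successful one and reuse that link for future requests.
from collections import defaultdict

rephrase_map = defaultdict(dict)  # failed phrase -> {successful rephrase: count}

def record_interaction(failed_request, successful_followup):
    counts = rephrase_map[failed_request]
    counts[successful_followup] = counts.get(successful_followup, 0) + 1

def suggest_rewrite(request):
    candidates = rephrase_map.get(request)
    if not candidates:
        return None                              # no evidence yet
    return max(candidates, key=candidates.get)   # most common successful follow-up

record_interaction("play chill music", "play the Indie Pop playlist on Spotify")
print(suggest_rewrite("play chill music"))
# -> play the Indie Pop playlist on Spotify
```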

The above example, which the team calls "rephrasing of the customer request," is one of the self-learning features Alexa AI rolled out in the past year as part of an ongoing worldwide expansion, Natarajan said. In recent months, the team introduced another such feature on consumer devices, this one focused on deciding which application is best suited to handle an incoming request.

“We haven’t yet gotten anywhere close to tapping out on supervised deep learning,” Natarajan said. He added, “Self-learning, as powerful as it is, is still in its infancy…If you can reason, then you can self-learn better because you can eliminate unreasonable learning.”

To do that, the AI team has invested in Alexa’s “awareness of ambient state” and deep learning focused on a customer’s “latent goal,” Natarajan said. For example, if you asked Alexa to turn off a lamp but the device couldn’t understand whether you said “on” or “off,” it could use its knowledge of the lamp’s ambient state to make a judgment call.
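As a rough illustration of that judgment call (our own toy example, not Alexa's implementation), the device's known ambient state can break the tie when speech recognition can't:

```python
# Toy example of ambient-state disambiguation: when "on" and "off" are equally
# plausible transcriptions, the lamp's current state makes one reading far more
# likely than the other.
device_state = {"bedroom lamp": "on"}

def resolve_ambiguous_power_command(device):
    current = device_state.get(device)
    if current == "on":
        return "turn off"   # asking to turn on an already-lit lamp is unlikely
    if current == "off":
        return "turn on"
    return None             # unknown state: fall back to asking the customer

print(resolve_ambiguous_power_command("bedroom lamp"))  # -> turn off
```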

“When we look out into the next year or two, we kind of see foundational advances in AI improving, reducing the friction, making the experience much smoother for customers…to use common-sense knowledge, other world knowledge, or ambient-state information…to say what is the best action for them,” Natarajan said.
