Amazon Thinks Chatbots Can Fix Alexa’s Most Infuriating Flaws
Large language models like the ones that power ChatGPT are now the backbone of the Alexa experience.
If you’re sick of hearing about AI-powered chatbots, I’ve got some bad news for you: Amazon, like pretty much every other tech company with an app, is pushing full steam ahead.
According to Amazon, which held its big annual Echo and Fire TV hardware event this week, chatbots are going directly into one of the company’s most ubiquitous products outside of its core e-commerce app. I’m talking about Alexa.
I know chatbot integration may no longer be big news — every tech company looking to juice its stock value seems to have already hopped on the AI hype train — but Amazon is promising more than just a ChatGPT knockoff. It’s promising a full-blown Alexa evolution.
Alexa with a chatbot twist
It’s no secret that voice assistants haven’t been all they were cracked up to be.
While Google and Amazon have gone to great lengths to load households up with gadgets you can yell at, most of the core problems that have made voice assistants annoying to use linger.
There’s a laundry list. They don’t understand multi-step commands; they sometimes mishear you; they give the more privacy-conscious of us nightmares. In short, they’re generally just not as easy or as fun to use as we’d like.
As someone who has used a voice assistant to command a makeshift smart home (smart apartment, actually) for years now, I feel that pain every time I wake up and try to shout my hipster AC off, only for my Nest Mini to play Sia instead. Alexa: please make it stop.
But after all these years, Amazon seems to have settled on the solution to those voice assistant woes: chatbots. With the help of its own large language model (LLM), Amazon says that Alexa is about to get a lot smarter.
For one, Amazon says LLMs will make Alexa more conversational. This week, Amazon’s Dave Limp demonstrated that capability, with some success, in a live demo.
Instead of the typical stiff, utilitarian responses, Alexa cracked a couple of jokes here and there, going so far as to call itself the “12th man” when asked (a question Limp had to repeat more than once) whether the voice assistant is a fan of the NFL’s Seattle Seahawks. Fun. Cute. But not really the point.
I can’t speak for everyone, but I don’t personally want or need my voice assistant to be adorable. I’d be fine with it being efficient, accurate, and reliable. According to Amazon, LLMs are here to fix that part too.
In its presentation, Amazon promised a few pretty eye-opening things. One of the standout promises, to me, is the new Alexa’s ability to better understand human speech.
Per Amazon:
Understanding these nuances is critical for making Alexa more intuitive. If I just told you I was cold, you’d know exactly what to do — turn up the heat — and Alexa can now process that same level of ambiguity. Just say “Alexa, I’m cold,” and Alexa will turn up the temperature. Or, “Alexa, it’s too bright in here,” and Alexa will dim the lights.
It may not seem like much, but let’s not forget: voice assistants are supposed to make things easier. If you’re constantly having to couch your questions in some awkward phrasing, or you’re unsure of what Alexa can and cannot do, then Alexa products lose that utopian convenience.
And according to Amazon, the benefits don’t stop there. The LLM-powered Alexa will apparently be better at understanding multi-step commands, too. That means anyone with an Alexa-powered device can execute more complex commands, like setting Routines.
According to Amazon:
... the LLM gives you the ability to program complex Routines entirely by voice — customers can just say, “Alexa, every weeknight at 9 p.m., make an announcement that it’s bed time for the kids, dim the lights upstairs, turn on the porch light, and switch on the fan in the bedroom." Alexa will then automatically program that series of actions to take place every night at 9 p.m.
Again, if you aren’t using voice assistants on a regular basis, this may not seem like a big deal. But anyone who’s ever lobbed a multi-step command at a current-gen voice assistant knows the frustration a request like that would normally bring.
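For a sense of what that one sentence has to turn into behind the scenes, here’s a rough sketch of the kind of structured routine Alexa would need to build from it. The format is entirely my own illustration; Amazon hasn’t said how Routines are actually represented under the hood:

```python
# Hypothetical sketch of the structured routine Alexa would have to derive
# from the single spoken request quoted above. Field names are illustrative,
# not Amazon's internal format.

routine = {
    "name": "Kids' bedtime",
    "trigger": {"schedule": "21:00", "days": ["Mon", "Tue", "Wed", "Thu", "Fri"]},
    "actions": [
        {"type": "announce", "text": "It's bedtime for the kids"},
        {"type": "set_brightness", "target": "upstairs lights", "level": 30},
        {"type": "power_on", "target": "porch light"},
        {"type": "power_on", "target": "bedroom fan"},
    ],
}

# Today you'd click something like this together step by step in the Alexa app;
# the promise is that the LLM fills in the whole structure from one sentence.
```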
The new Alexa won’t just rely on its own AI acumen to make things easier, either. Amazon also announced that it will link device makers to its LLM through programs called Device Controller and Device Actions, which let third parties describe the actions their gadgets can perform.
Theoretically, this will tell the chatbot just what kind of device it’s dealing with, and the bot can then infer a wider range of commands, e.g. “make my lights spooky” instead of “set my lights to purple.”
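Amazon didn’t get into developer specifics, but the idea seems to be that device makers declare what their gadgets can do in some structured form the model can reason over. Here’s a rough, purely hypothetical sketch of that idea; the schema and the stand-in “resolver” are mine, not Amazon’s actual API:

```python
# Purely hypothetical sketch: the capability description and resolve_request()
# below are my own illustration, not the real Device Controller / Device Actions API.

smart_bulb = {
    "device": "living room lights",
    "actions": {
        "set_color": {"parameter": "color", "values": ["red", "purple", "orange", "white"]},
        "set_brightness": {"parameter": "percent", "range": [0, 100]},
    },
}

def resolve_request(request: str, device: dict) -> list[dict]:
    """Stand-in for the LLM: map a vague request onto the device's declared actions."""
    # A real LLM would reason over the capability description; this hard-coded
    # mapping just illustrates the kind of output it would need to produce.
    if "spooky" in request.lower():
        return [
            {"action": "set_color", "color": "purple"},
            {"action": "set_brightness", "percent": 40},
        ]
    raise ValueError(f"Can't map request to {device['device']}'s declared actions")

print(resolve_request("make my lights spooky", smart_bulb))
# [{'action': 'set_color', 'color': 'purple'}, {'action': 'set_brightness', 'percent': 40}]
```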
Alexa 2.0
In some ways, Amazon’s promises are ho-hum. It’s hard to call the ability to better turn lights on and off a breakthrough. But as a believer in smart homes and the way they can augment our lives, I’m personally stoked to see a company tackling the crux of the problem. I still think voice assistants can be the future, even if we’ve been waiting too long for that future to arrive.
Amazon hasn’t said it, but in many ways what it’s really promising, with the help of LLMs and chatbot technology, is an Alexa 2.0. That is to say, an Alexa that’s smarter, faster, and hopefully a lot less likely to play Sia instead of turning off your AC.
I won’t know for sure until I get to shout at a next-gen piece of plastic for myself, but I’m personally ready for the Alexa evolution if it’s as seamless as Amazon suggests.