Alexa, What’s Your Future? Perhaps Finally Making Money
24.09.2023 - 01:11
/ tech.hindustantimes.com
When Amazon.com Inc. executive Dave Limp announced in August that he would leave by the end of this year, I wrote that he would depart without having satisfactorily answered the question of what Alexa was for and why the gadget-buying public truly needed the voice-assistant technology.
Amazon seemed to be asking similar questions — Limp's unit, reported to have been burning through $5 billion a year, was hit hard by recent cutbacks at the company. Meanwhile, the explosion of AI, kick-started by the launch of ChatGPT, has drastically redefined what a digital assistant can be expected to do. Alexa looked comparatively stupid: ChatGPT users were writing essays; Alexa users were setting egg timers.
OpenAI's groundbreaking tool sent shockwaves through Silicon Valley, in no small part because it wrong-footed bigger tech companies that hadn't yet been able to put generative AI capabilities directly into the hands of consumers.
The shifting sands — ChatGPT, Amazon cutbacks, Limp's departure — have raised the question of where Amazon's devices unit goes from here. Limp's swan-song keynote this week, held at the company's new “HQ2” campus in Arlington, Virginia, attempted to show that Alexa could close the gap on AI.
Crucially, the new phase of Alexa could mean a new business model in which users pay an additional monthly fee for a more sophisticated virtual assistant.
Will Alexa finally be able to make money for Amazon? In a nod to the question-and-answer relationship users have with Alexa, below is a conversation I had with Limp after his keynote. He's a talker, so the exchange has been edited for brevity and clarity. I'll jump in with my thoughts as we go.
Dave Lee: Last time we spoke, you were talking about ambient computing. Now you're instead calling it ambient “intelligence.” You mentioned “generative AI” within about 30 seconds of the keynote starting. ChatGPT has obviously had an impact. How has Amazon changed what it is doing?
Dave Limp, Amazon: Well, there's no way we could have shown you what we showed you [on Wednesday], and what we're going to be shipping before the end of the year, if we had started it when ChatGPT was announced. It's just impossible to do what you've seen us do in that period of time. For the past couple years in the background, we've been using generative AI [for building Alexa features]. But what has become pretty clear, at least to me and the team, is that as you feed these models more data, they get larger, they get better. And they haven't stopped getting better. The real hard work is fine-tuning that model for your use case. In the home, the use case is very different than on your phone or in your browser. And that's what we're spending the most time on now.
His argument is that Alexa offers a real-world