Large language models are currently being pitched as the solution for everyone. The technology’s versatility is part of its appeal: the use cases for generative AI seem both immense and infinite. But then you actually use the hardware, and not enough of it works very well. And you wonder what we’re actually accomplishing here.
On this episode of The Vergecast, Nilay joins the show full of thoughts on the current state of AI, especially after spending a summer trying to get his smart home to work. But before we get to that, let’s talk about our new ad-free podcast option, launching this week! (If you’re a subscriber, go to your account settings to find streams of all our shows.) We also talk about Apple’s new M5-powered MacBook Pro, iPad Pro, and Vision Pro, and wonder just how big a chip bump really is.
After that, it’s time to talk about AI. We dig into the state of AI assistants, which are clearly the killer consumer app for LLMs, and which no one has managed to build particularly well yet. ChatGPT and others are fun to use! But that is not the same thing as being useful, especially not in the omnipresent, omniscient way that we actually need. The ideal product here is both obvious and enticing, but we don’t feel like we’re close. Unless you’re really eager to talk to your laptop.
If you want to learn more about everything we discuss in this episode, here are some links to get you started, first in podcast and gadget news:
And in the lightning round: