I’ve been thinking a lot about where AI shows up next, and how that matters more than what the AI can do. It’s not about waiting for the next big model. It’s about how intelligence gets embedded into the tools and touchpoints we already use or will soon rely on. Whether it’s performance wearables, search that understands context, or the ability to generate voice content from your phone on the fly, we’re seeing signals of something deeper. These aren’t just tools. They’re clues about the future.
In this week’s podcast episode, I explore a few of those signals. I look at Meta’s new glasses built with Oakley for athletes, Apple’s rumored interest in Perplexity, and the new mobile voice app from ElevenLabs. It’s a short episode but full of insight. I hope you’ll give it a listen.