At Qualcomm’s Snapdragon Summit in Hawaii, I got a glimpse of how I’ll interact with my phone in the future. Unfortunately, the future is not quite ready for us yet.
The big promise of AI agents is that they’ll handle tasks for you — using their knowledge of you and what’s stored on your phone to suggest, predict and automate what you need, easing the burden on you.
So far, the situations in which we’d use AI agents in our day-to-day lives have been largely hypothetical. But at the summit, I got a first-hand look at how we might use an agent to complete a routine task: uploading content to social media.
On a prototype phone packing Qualcomm’s new Snapdragon 8 Elite Gen 5 chip, I used my voice to ask the device to find all pictures of beaches stored in the Photos app. A large language model (LLM) running on the device picked up what I was saying and interacted with a vision model that classifies all the photos on the phone. It pulled up two options.
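To make the pipeline concrete, here is a minimal sketch of how a voice request might flow through the two models. Everything here is a hypothetical stand-in — a real phone would run an actual speech recognizer, on-device LLM and vision model, where this toy version fakes each stage with simple keyword matching:

```python
def parse_intent(utterance: str) -> str:
    """Stand-in for the on-device LLM: pull out the photo label the user asked for."""
    # e.g. "find all pictures of beaches" -> "beach" (toy keyword matching)
    words = utterance.lower().split()
    return "beach" if "beach" in words or "beaches" in words else ""

def classify_photo(filename: str) -> str:
    """Stand-in for the vision model: label a photo (here, cheating via its filename)."""
    return "beach" if "beach" in filename else "other"

def find_photos(utterance: str, library: list[str]) -> list[str]:
    """Agent loop: the LLM-parsed intent drives the vision model over the photo library."""
    label = parse_intent(utterance)
    return [photo for photo in library if classify_photo(photo) == label]

library = ["beach_sunset.jpg", "office_desk.jpg", "beach_waves.jpg"]
matches = find_photos("find all pictures of beaches", library)
```

With this sample library, the agent surfaces the two beach photos — mirroring the two options the prototype pulled up.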