Apple’s approach with AI is to understand personal context when delivering answers and carrying out tasks.
Apple Intelligence, Apple’s long-awaited deeper push into AI, made its debut at the company’s Worldwide Developers Conference on Monday. It will be integrated into iOS 18, iPadOS 18 and macOS Sequoia, and OpenAI’s ChatGPT technology will be more deeply tied into those platforms as well.
AI has played an important role in Apple products for years. But the widespread popularity of OpenAI’s ChatGPT chatbot has shone a bigger spotlight on generative AI — models trained on large swaths of data that can create content in response to prompts. As rivals such as Samsung, Google and Microsoft have been infusing the tech into their most important products, all eyes have been on Apple to see how the company plans to bring generative AI to its own products. At WWDC 2024, we’re finally getting that answer.
Understanding personal context when delivering answers and carrying out tasks is a big part of Apple’s approach with Apple Intelligence. Apple seems to be using this tactic to distinguish its own AI efforts from those previously announced by competitors. As an example, the company explained how Apple Intelligence can weigh factors like traffic, your schedule and your contacts to help you figure out whether you can make it to an event on time.
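As a rough illustration of that kind of reasoning, here is a minimal Swift sketch that combines a live travel-time estimate with a calendar event to answer the "can I make it on time?" question. The types and function names are invented for this example and don't reflect any actual Apple Intelligence API.

```swift
import Foundation

// Hypothetical types for the example; not Apple's actual APIs.
struct PersonalContext {
    let travelTime: TimeInterval     // e.g. a live, traffic-aware estimate
    let eventStart: Date             // pulled from the user's calendar
    let buffer: TimeInterval         // extra margin for parking, walking, etc.
}

// Returns true if leaving now would get you there before the event starts.
func canMakeEvent(_ context: PersonalContext, leavingAt now: Date = Date()) -> Bool {
    let estimatedArrival = now.addingTimeInterval(context.travelTime + context.buffer)
    return estimatedArrival <= context.eventStart
}

// Example: a 35-minute drive plus a 5-minute buffer, against a meeting
// that starts in an hour.
let context = PersonalContext(
    travelTime: 35 * 60,
    eventStart: Date().addingTimeInterval(60 * 60),
    buffer: 5 * 60
)
print(canMakeEvent(context))  // prints "true"
```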
Apple also opened the AI portion of its keynote with an emphasis on privacy. It explained that many of its AI models run on-device, which is generally considered more private since information doesn’t have to travel over the internet. For tasks that call for larger models, Apple will use a system called Private Cloud Compute: those requests run on servers Apple has built specifically to run on Apple Silicon. Apple Intelligence decides whether a task can be processed on-device or needs to go to the cloud.
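One simplified way to picture that routing decision, in a Swift sketch of my own invention (none of these names come from Apple): prefer the on-device model, and fall back to Private Cloud Compute only when the request exceeds what the local model can handle.

```swift
// Illustrative only; the types and capacity numbers are assumptions, not Apple's API.
enum ExecutionTarget {
    case onDevice              // handled by the local model
    case privateCloudCompute   // sent to Apple Silicon servers
}

struct AIRequest {
    let requiredModelCapacity: Int   // how much model the task needs
}

struct Device {
    let onDeviceModelCapacity: Int   // what the local model can handle
}

// Prefer on-device processing; escalate only when the task exceeds it.
func chooseTarget(for request: AIRequest, on device: Device) -> ExecutionTarget {
    request.requiredModelCapacity <= device.onDeviceModelCapacity
        ? .onDevice
        : .privateCloudCompute
}
```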
Siri upgrades

Siri is a big part of Apple’s AI push. The voice assistant is becoming more natural, relevant and personal with this update. You’ll be able to speak in more natural language, and Siri should be able to understand you even if you stumble over your words. Because it understands context, you can ask follow-up questions and Siri should know what you mean. Apple’s virtual assistant will support text input as well.
It’s also getting better at helping you use your iPhone, since it will be able to answer questions about specific iPhone features.
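To see why follow-up questions can work without you repeating yourself, here’s a toy Swift sketch of a session that keeps prior turns around and reuses the last topic when a new request doesn’t name one. It’s a guess at the general idea, not how Siri is actually implemented.

```swift
import Foundation

// Toy model of conversational context; not Apple's implementation.
struct Turn {
    let userText: String
    let topic: String
}

final class AssistantSession {
    private var history: [Turn] = []

    func handle(_ text: String) -> String {
        // If the request doesn't mention a topic, fall back to the last one discussed.
        let topic = extractTopic(from: text) ?? history.last?.topic ?? "unknown"
        history.append(Turn(userText: text, topic: topic))
        return "Answering about \(topic)"
    }

    // Crude heuristic for the sketch: the last capitalized word is the topic.
    private func extractTopic(from text: String) -> String? {
        text.split(separator: " ")
            .last { $0.first?.isUppercase == true }
            .map { String($0).trimmingCharacters(in: .punctuationCharacters) }
    }
}

let session = AssistantSession()
print(session.handle("What's the weather in Cupertino?"))  // Answering about Cupertino
print(session.handle("what about tomorrow?"))              // Answering about Cupertino
```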