Apple unveils AI-powered Siri search engine
Image: An Apple Store in Shibuya, Japan. Flickr user Dick Thomas Johnson // CC 2.0 License

Apple, which has lagged in the AI race, is now reportedly preparing a major AI overhaul in the form of ‘World Knowledge Answers’, an AI-powered web search tool that would first go live on iOS, with planned expansion to Safari and Spotlight. The new feature is designed to transform Siri from a limited voice assistant into a fully fledged answer engine that can generate multimedia results on demand.

Unlike the current version of Siri, which often redirects users to Google or third-party sources, the upgraded system will produce summaries that blend text, images, video, and even local information. The goal is to rival emerging platforms such as ChatGPT, Google’s AI Overviews, and Perplexity. Apple is targeting an initial rollout in March 2026 with iOS 26.4, coinciding with a long-promised Siri update; the launch will follow this month’s introduction of the iPhone 17 lineup, which will run the base version of iOS 26.

When it comes to web search, Apple has largely taken a partnership-first approach, most notably keeping Google as the default search engine across iOS devices for well over a decade. The launch of World Knowledge Answers marks the company’s first direct challenge to that status quo: the Cupertino-headquartered tech titan is seeking to reduce its long-term dependency on outside search partners and to assert greater control over one of the highest-value touchpoints in its ecosystem. This comes as the voice assistant landscape is being transformed by rapid advances in generative AI, forcing Apple to catch up with (and potentially surpass) OpenAI, Google, and Amazon in this domain.

Apple has discussed expanding the engine to Safari and Spotlight, creating multiple touchpoints for AI-powered search across its ecosystem. That would mark the company’s most direct move into web search since it made Google the default option across iOS devices. At the core of the initiative is a new Siri architecture that integrates large language models (LLMs). Apple’s engineers are building three key components: a planner, which interprets voice or text input and decides how to respond; a search system, which retrieves information from the web or on-device data; and a summarizer, which compiles results into a concise, conversational answer.
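To make the reported three-part architecture concrete, here is a minimal sketch of how a planner, a search system, and a summarizer could be chained together. All names, routing rules, and data are hypothetical illustrations; Apple has not published any implementation details.

```python
# Hypothetical sketch of a planner -> search -> summarizer pipeline.
# Nothing here reflects Apple's actual design; it only illustrates the
# three roles described in reporting on the new Siri architecture.

from dataclasses import dataclass


@dataclass
class Plan:
    intent: str  # e.g. "web_search" or "on_device"
    query: str


def planner(user_input: str) -> Plan:
    # Interpret the request and decide how to respond:
    # route device-style commands to on-device data, everything else to the web.
    if user_input.lower().startswith(("find my", "open settings")):
        return Plan(intent="on_device", query=user_input)
    return Plan(intent="web_search", query=user_input)


def search(plan: Plan) -> list[str]:
    # Stand-in retrieval step: a real system would query the web
    # or local indexes; here we return canned snippets per intent.
    corpus = {
        "web_search": ["Result A about the query.", "Result B with more detail."],
        "on_device": ["Matching item from local data."],
    }
    return corpus[plan.intent]


def summarizer(results: list[str]) -> str:
    # Compile retrieved snippets into one concise, conversational answer.
    return " ".join(results)


def answer(user_input: str) -> str:
    plan = planner(user_input)
    return summarizer(search(plan))


print(answer("What is World Knowledge Answers?"))
print(answer("find my photos"))
```

The key design point the reporting implies is the separation of concerns: the planner decides *where* to look, the search system does the retrieval, and the summarizer shapes the raw results into a single answer, so each stage can be swapped independently (for example, backing the summarizer with a different model).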

This framework is intended to let Siri move beyond basic fact retrieval into richer explanations, contextual answers, and task execution. Despite aiming to build its own foundation models, Apple is leaning on partners for certain functions, reportedly including a customized version of Google’s Gemini model. The Gemini-based tech would run on Apple’s Private Cloud Compute servers, giving the company control over data privacy while borrowing Google’s expertise in generative AI. The system is also expected to revive and expand Spotlight’s role. For years, Spotlight has offered basic answers about movies, sports, and people, but the new AI backbone could make it a central hub for both personal and web queries. Safari, meanwhile, would gain built-in AI results alongside traditional search links.

Apple is also evaluating Anthropic’s Claude model and in-house systems for various aspects of the project. Current plans suggest that Apple Foundation Models will continue to handle searches across personal data, keeping sensitive information away from third-party providers. The overhaul is not limited to web search. Apple’s teams are developing Siri features that better integrate with on-screen content, enabling the assistant to respond to context-specific commands. Users could, for instance, highlight text in an app and ask Siri to summarize or translate it, or navigate more precisely through device settings using voice alone.

The Tech Portal is published by Blue Box Media Private Limited. Our investors have no influence over our reporting. Read our full Ownership and Funding Disclosure →