Despite partnering with OpenAI to integrate the AI giant's generative artificial intelligence (gen-AI) technology, Apple is set to develop a more conversational version of Siri.
Siri is Apple's digital assistant for iOS, macOS, tvOS, and watchOS devices. It uses voice recognition to carry out tasks that users request.
In December, Siri will also be equipped with new capabilities powered by Apple Intelligence, the tech giant’s AI. The company says that built-in intelligence features will make Siri more capable, personal and helpful.
Bloomberg reported that Apple Inc. is now racing to develop a more conversational version of its Siri digital assistant, aiming to catch up with OpenAI’s ChatGPT and other voice services.
It was further reported that this new version of Siri is leveraging more advanced large language models (LLMs) to allow for back-and-forth communication.
“The system also can handle more sophisticated requests in a quicker fashion,” anonymous sources told Bloomberg.
Apple’s LLM Siri
Called "LLM Siri," the upgraded assistant will run on Apple Intelligence models trained on huge amounts of data to identify patterns and answer questions.
Apple has reportedly been testing the new software on iPhones, iPads, and Macs as a separate app. The technology is aimed at eventually replacing the current Siri interface.
The tech giant plans to announce this overhaul as early as 2025 when it releases new updates for iOS 19 and macOS 16, codenamed "Luck" and "Cheer," respectively.
However, this new feature will likely not reach Apple's hardware line-up in the new year. The company is preparing to release the revamped Siri to consumers as early as spring 2026, according to Bloomberg.
LLM Siri is expected to communicate with consumers like a human and not seem as robotic as other conventional chatbots. It will handle tasks that are similar to OpenAI’s ChatGPT and Google’s Gemini.
Apple's upgraded AI-powered digital assistant aims to enable more precise control over third-party applications by expanding the use of App Intents.
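To give a rough sense of how App Intents let an assistant drive third-party apps, a developer declares the app's actions as intents that Siri can discover and invoke. The sketch below is illustrative only: the `MessageService` helper and its `send` method are hypothetical stand-ins for an app's own logic, while the `AppIntent` protocol, `@Parameter` wrapper, and `perform()` requirement are part of Apple's App Intents framework.

```swift
import AppIntents

// A hypothetical intent exposing a "send message" action from a
// third-party messaging app to Siri via the App Intents framework.
struct SendMessageIntent: AppIntent {
    static var title: LocalizedStringResource = "Send Message"

    // Parameters Siri can fill in from the user's spoken request.
    @Parameter(title: "Recipient")
    var recipient: String

    @Parameter(title: "Message")
    var body: String

    func perform() async throws -> some IntentResult {
        // MessageService is a hypothetical stand-in for the
        // app's real messaging logic.
        try await MessageService.shared.send(body, to: recipient)
        return .result()
    }
}
```

With intents like this registered, a more capable Siri could map a request such as "send Alex a message saying I'm running late" directly onto the app's action rather than merely opening the app.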
Apple Intelligence will likely support Siri's capability to write and summarise text, alongside other capabilities that are still unknown.
Apple is reportedly developing a next-generation LLM for iOS 19, designed to enhance its AI capabilities and provide more advanced features, similar to ChatGPT.
What is Siri?
Siri is a conversational digital assistant developed by Apple Inc. It was first introduced to the world on the iPhone 4S smartphone in October 2011.
Generally referred to as a 'virtual assistant', the digital tool was named by co-creator Dag Kittlaus after the Norwegian name Sigrid. The name originates from the Old Norse words sigr, meaning "victory", and fríðr, meaning "beautiful".
Siri's origins can be traced back to a military project – the Cognitive Assistant that Learns and Organizes (CALO) project, which was funded by the Defense Advanced Research Projects Agency (DARPA). The project aimed to create a virtual assistant to help military personnel manage their daily activities.
Siri Key Benefits
1. Powered by Apple Neural Engine
Siri, Apple's virtual assistant, isn't linked to the user's Apple ID. It's backed by the power of the Apple Neural Engine, which ensures that the audio of users' requests never leaves their iPhone, iPad, Apple Watch, or Apple Vision Pro unless they choose to share it.
2. Data Security
Apple says that this is an on-device intelligence tool that makes the user experience with Siri personal. It learns the users’ preferences and what they might want while maintaining their privacy. The tech giant emphasised that users’ data shared with Siri is never shared with advertisers.
Siri Key Challenges
1. Complex performance
Currently, Siri relies on Apple's first-generation LLM, and its performance is further hindered by the need to route complex queries to a secondary, third-party LLM within the iOS 18 framework.
This can result in slower response times and less accurate results. Siri also often misunderstands or misinterprets users' prompts. It lacks contextual understanding overall and thus finds it hard to maintain context in a conversation. Right now, Siri's responses are more robotic, but Apple plans to make these interactions more human-like.
2. Limited Integration
Siri's integration with third-party applications is limited, which can make it even more difficult to perform complex tasks. This means it often cannot fully access the features offered by third-party applications.
For instance, if a user prompts Siri to play music or send messages to ‘x’, it sometimes struggles to break down the tasks and follow through accurately. This limitation can hinder Siri's usefulness and make it less competitive compared to other voice assistants that offer deeper integration with third-party services.
3. Outdated Interface
Siri's interface, when compared with newer voice assistants, is not as accurate. It can be slower to interpret users' prompts and may be less intuitive than models like ChatGPT or Gemini.
Siri's voice also sounds less natural, but Apple plans to address this in the future with the revamped Siri.
Also, Siri's on-screen interface can sometimes be cluttered and difficult to navigate, especially when multiple tasks or requests are involved. This can lead to frustration and confusion, particularly for users who are less tech-savvy.
How to Use Siri?
1. Siri on iPhone
Just say "Siri" or "Hey Siri", then immediately say what you need. Set up how to get Siri's attention in Settings > Siri or Settings > Apple Intelligence & Siri, then tap "Siri" or "Hey Siri".
Press and release the button
- If your iPhone has a Home button, press the Home button, then make your request.
- If your iPhone doesn't have a Home button, press the Side button, then make your request.
- To make a longer request, press and hold the Side or Home button until you’re finished with your request.
Type instead of speaking to Siri
If you have Apple Intelligence turned on, double-tap the bottom of the screen and type your request to Siri.
If you do not have Apple Intelligence, turn on Type to Siri. Then press the button to activate Siri and type your request.
2. Siri on Mac
Using voice
On supported Mac models, say "Siri" or "Hey Siri", then make your request.
Use Siri button
- In the menu bar or Dock, click the Siri button, then say what you need.
- On a Mac with a Touch Bar, tap the Siri button, then make your request.
- On an Apple keyboard with function keys, press and hold the Dictation key, then make your request.
- To make a longer request, hold the Siri button until you’re finished with your request.
Type instead of speaking to Siri
Enable Type to Siri. Then press or click the Siri button and type your request.