Apple plans for a smarter, LLM-based Siri assistant

news analysis
Nov 22, 2024 | 5 mins

Soon, it might even be able to open the cabin door for you.

Once upon a time, we’d say software is eating the planet. It still is, but these days our world is being consumed by generative AI (genAI), which is seemingly being added to everything. Now, Apple’s Siri is on the cusp of bringing its own form of genAI to a more conversational version that Apple insiders are already calling “LLM Siri.”

What is LLM Siri?

Apple has already told us to expect a more contextually aware version of Siri in 2025, part of the company’s soon-to-be-growing “Apple Intelligence” suite. This Siri will be able to, for example, respond to questions and requests concerning a website, contact, or anything else you happen to be looking at on your Mac, iPhone, or iPad. Think of it as an incredibly focused AI that works to understand what you are seeing and tries to give you relevant answers and actions that relate to it.

That’s what we knew already. What we learn now (from Bloomberg) is that Apple’s AI teams are working to give Siri even more capabilities. The idea is to ensure Apple’s not-so-smart smart assistant can better compete against chatbots like ChatGPT, thanks to the addition of the kind of large language models (LLMs) that already power ChatGPT and Gemini.

What will Smart Siri do?

This smarter Siri will be able to hold conversations and drill into enquiries, just like those competing engines, particularly Advanced Voice Mode on ChatGPT. Siri’s responses will also become more human, enabling it to say, “I have a stimulating relationship with Dr. Poole,” and for you to believe that.

These conversations won’t only need to be the equivalent of a visit to the therapist on a rainy Wednesday; you’ll also be able to get into fact-based and research-focused conversations, with Siri dragging up answers and theories on command.

In theory, you’ll be able to access all the knowledge of the internet and a great deal of computationally driven problem solving from your now-much-smarter smartphone. Apple’s ambition is to replace, at least partially, some of the features Apple Intelligence currently hands off to ChatGPT, though I suspect the iPhone maker will be highly selective in the tasks it does take on.

The company has already put some of the tools in place to handle this kind of on-the-fly task assignment; Apple Intelligence can already check a request to see whether it can be handled on the device or on Apple’s own highly secure servers, or whether it needs to be handed over for processing by OpenAI or any other partners that might be in the mix.

When will LLM Siri leap into action?

Bloomberg speculates that this smarter assistant tech could be one of the highlight glimpses Apple offers at WWDC 2025. If that’s correct, it seems reasonable to anticipate the tech will eventually be introduced across the Apple ecosystem, just like Apple Intelligence.

You could be waiting a while for that introduction; the report suggests a spring 2026 launch for the service, which the company is already testing as a separate app across its devices.

In the run-up to these announcements, Siri continues to develop more features. As of iOS 18.3 it will begin to build a personal profile of users in order to provide better responses to queries. It will also be able to use App Intents, which let third-party developers make the features of their apps available across the system via Siri. ChatGPT integration will make its own debut next month.
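App Intents is Apple’s existing framework for exposing app actions to Siri, Shortcuts, and the wider system, and it gives a sense of how little a developer has to do to make a feature reachable by a smarter assistant. As a rough illustration only (the intent name and behavior here are hypothetical, not anything Apple has announced), a minimal App Intent in Swift looks something like this:

```swift
import AppIntents

// A minimal, hypothetical App Intent. Declaring a type like this is roughly
// how a third-party app makes one of its features callable by Siri,
// Shortcuts, and Spotlight.
struct OpenReadingListIntent: AppIntent {
    // The name Siri and Shortcuts display for this action.
    static var title: LocalizedStringResource = "Open Reading List"

    // Called when Siri (or a Shortcut) runs the intent.
    func perform() async throws -> some IntentResult {
        // A real app would navigate to, or act on, the feature here.
        return .result()
    }
}
```

In other words, much of the plumbing Siri needs to act inside third-party apps is already there; the LLM work is about making the conversation that triggers it smarter.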

Will it be enough?

Siri as a chatbot is one area in which Apple does appear to have fallen behind competitors. While it seems a positive, at least in competitive terms, that Apple is working to remedy that weakness, its current competitors will not be standing still (though unfurling AI regulation might put a glass ceiling on some of their global domination dreams).

Apple’s teams will also be aware of the work taking place in the background between former Apple designer Jony Ive and Sam Altman’s OpenAI, and will want to ensure the company has a moat in place to protect itself against whatever the fruits of that labor turn out to be.

With that in mind, Apple’s current approach — to identify key areas in which it can make a difference and to work towards edge-based, private, secure AI — makes sense and is likely to remain the primary thrust of Apple’s future efforts.

Though if there’s one net positive every Apple user already enjoys out of the intense race to AI singularity, it is that the pre-installed memory inside all Apple devices has now increased. Which means that even those who never, ever, ever want to have a conversation with a machine can get more stuff done quicker than before.

