
Apple’s seeming lag in the AI race isn’t just a case of being caught flat-footed. While companies like Google and OpenAI push ahead with powerful, conversational AI, Apple is taking a different route—one that prioritizes privacy. And that stance might be what's causing its headaches in delivering an AI-driven Siri experience that competes with the likes of Gemini or ChatGPT.
Before diving in, let me offer some context: I’m heavily invested in Apple products. I don’t love the term, but if we’re labeling, I’d be called an "Apple Fanboy." At the same time, I also use Google’s services professionally and personally. I pay for Workspace, I use Gemini, and I subscribe to ChatGPT Plus. In short, I live in both ecosystems, which gives me at least some perspective on the AI landscape at a basic user level.
The AI Assistant Usage Question
Siri is, by most accounts, lackluster. But before we ask why Apple hasn’t made it as powerful as Gemini or ChatGPT, we should consider: how many people actually talk to their phones as though they were AI-driven assistants?
Apple boasts over two billion active devices, of which ~69% are thought to be iPhones. But what percentage of those users are truly engaging with AI assistants in the way we see in promotional materials? I’d wager the number is far below 50% for anything other than simple tasks. Even within Google’s much smaller Pixel user base, I suspect it’s similarly low.
For those who do use AI assistants, the requests likely fall into three categories:
Simple tasks: Setting timers, making calendar appointments, controlling smart home devices, and the like.
Conversational: Engaging in dialogue about life, personal preferences, or general queries.
Sophisticated requests: Asking the assistant to book flights, manage finances, or handle deeply personal information.
Of those, the majority likely fall into the first category. Hardly anyone trusts an AI to handle complex tasks like planning a holiday using sensitive personal data, and even conversational AI usage remains niche. That raises the question: is Apple behind, or is it simply prioritising the kind of AI most people are comfortable using today?
Where Siri Fails (and Where It’s Not Alone)
Siri struggles even with basic commands, often failing at tasks that Google Assistant has handled competently for years. And while Apple is teasing improvements with "Apple Intelligence," the gap remains significant.
But is Google’s AI ecosystem that much better? Not always. Take summaries—one area where AI assistants should shine. Siri’s summaries are often useless. Yet, in my experience, asking Gemini to rewrite an email or summarise a Gmail thread produces similarly disappointing results. The issue isn’t just Apple’s slow progress; it’s that AI isn’t quite there yet in these areas.
Privacy: Apple’s Achilles' Heel or Competitive Edge?
This brings us to privacy—an area where Apple has staked its reputation.
In my professional work in digital analytics, privacy and data collection regulations are a constant concern. Regulations such as the GDPR, PECR, and CCPA govern what organisations can and cannot do with personal data, and breaches come with hefty fines. Yet AI models like Gemini and ChatGPT don't seem to be held to the same level of scrutiny. Users willingly pour personal details into these models without much thought to where that data goes.
When DeepSeek surfaced, the initial excitement quickly turned to concern once privacy issues were exposed. And then there's Google: an advertising giant that derives the overwhelming majority of its revenue from advertising built on user data. Should I trust Gemini with deeply personal information stored on my phone? Not a chance.
Apple’s approach is different. The company has long marketed itself as the guardian of user privacy, and it’s likely this philosophy is what’s slowing down its AI progress. If Apple is truly committed to a privacy-first AI assistant, it will need to build AI models that don’t rely on the same level of data access as Google’s products do. That’s no small challenge, but it could ultimately be a long-term win.
The Waiting Game: Is Privacy Worth the Delay?
If Apple users must wait years for a privacy-first, fully agentic Siri, will it be worth it? I’d argue yes.
In the meantime, Apple users who want advanced AI capabilities can simply install apps like ChatGPT or Gemini. These can provide intelligent conversation and task assistance without deep integration into the OS. That means Apple doesn’t need to rush a half-baked AI just to keep up. It can afford to take its time.
Google's aggressive push into AI, while impressive, comes with a trade-off: more data collection, more user profiling, and more potential privacy concerns. Leaked documents have revealed that Google collected user browsing data through Chrome despite previously downplaying such concerns. Given Google's extensive data-driven business model, it is fair to ask how Gemini will handle user data as well.
For now, I’m fine keeping certain tasks to myself, letting my own brain do the work rather than handing over ultra-personal data to any company (not just Google) that thrives on monetising it. If Apple’s slow AI progress means maintaining control over my privacy, I can live with that.
Conclusion
Apple's lag in the AI race isn't just about poor planning or lack of investment; it's also a byproduct of its privacy stance. While Siri remains unimpressive, Apple's measured approach might ultimately pay off by offering an AI experience that respects user data in a way Google's may never match.
In the short term, Apple users may feel left behind. But in the long run, a privacy-first AI might be worth the wait.
Update: 16 March 2025
Since this article was posted on 11 March 2025, Amazon has announced that, starting 28 March 2025, all commands spoken to an Alexa+ device will be sent to the cloud. Local processing will no longer be possible.
Amazon has a history of privacy controversies. A few years ago, it was revealed that Alexa recordings sent to the cloud were reviewed by employees to improve performance—some of which contained background conversations and other unintended audio.
Meanwhile, Google has announced that if users enable personalization, Gemini will use their search history to "enhance" certain responses. However, users can disconnect Gemini from their search history at any time. When search history is referenced, Google says responses will indicate this clearly and provide a disconnect link.
These updates highlight the growing tension between AI-driven convenience and user privacy. As companies continue making these choices, consumers must decide how much data they are willing to trade for improved functionality. However, transparency in how data is used—and whether users truly have meaningful control—remains key to maintaining trust. Ethical AI should empower users with clear, fair choices, not obscure trade-offs hidden in settings or vague policies.