Apple will favor on-device AI over cloud processing, more evidence suggests

Daniel Sims

Why it matters: All signs indicate that future Apple devices will begin to leverage generative AI in the second half of this year, but exactly how remains uncertain. Meanwhile, Microsoft, Intel, and Qualcomm are already pushing the AI trend and have recently unveiled new hardware that can run intensive AI tasks on-device to reduce reliance on the cloud. There's evidence suggesting that Apple intends to do the same.

Bloomberg's Mark Gurman reports that Apple will reveal its first AI applications later this year, noting that they will run locally without an internet connection. Gurman's comments align with prior speculation deduced from the company's hiring records.

The most popular generative AI tools, like ChatGPT, Bard, Copilot, DALL-E, and Midjourney, currently rely on cloud processing. However, tech giants have begun producing devices with increasingly powerful neural processing units (NPUs) dedicated to handling similar workloads.

On-device AI that never transmits data would suit Apple's privacy-focused marketing message, so it would be unsurprising if the company took that route. Other reports suggest that Cupertino could rely on Google or OpenAI for specific AI applications, but whether those would run on the cloud is unclear.

Another sign of Apple's plans is its recent acquisition of Paris AI firm Datakalab. The seven-year-old company specializes in image analysis and algorithm compression. The investment could be related to one of Cupertino's proposed AI projects – a multimodal language model that can interpret and count objects in images.

Although Apple has been slow to jump on the AI hype train, a pivot toward the technology is expected with the unveiling of next-generation iPhones and Macs later this year. In typical Apple fashion, we wouldn't be surprised if the AI software experience were simplified and integrated into the general UI, creating a more seamless user experience than what current implementations like Copilot or ChatGPT offer.

iOS 18 will likely debut at WWDC in June, with the iPhone 16 following in the fall and multiple Macs featuring the new M4 processor arriving in late 2024 and 2025. The roadmap might cause the Mac mini to skip the M3 series in favor of the M4.

The iPhone 16's A18 processor could introduce Apple's first major NPU upgrade in years, shifting from 16 to 32 cores. Its performance might significantly exceed the iPhone 15 Pro's 35 TOPS (trillion operations per second) and the Mac Studio's 31 TOPS. Information on the M4's NPU upgrades is scant, but leaks indicate the processor lineup will focus on local AI processing. Apple's products may match the "next-generation AI PC" standard that Microsoft and Intel recently defined, which calls for an NPU capable of at least 40 TOPS.

 
I can think of plenty of reasons for and against doing it.
One practical reason, for me, to make it local: whenever I go to events like July 4 fireworks, the internet is almost unusable. For someone who goes to places like that fairly often, it would be more convenient if this feature didn't depend on the internet.
 
If there's one thing I can give to Apple, it's that these AI things are better when they can be reasonably localized.
 
When and if Apple increases dedicated silicon for this, my bet is there will be solid real-world uses for it on day one within the Apple ecosystem. That will let them tout practical benefits (all with made up Apple-specific brand names of course) that will make for a much better feel.

Meanwhile, I'm already experiencing the "Copilot" button on my Windows taskbar and the "AI-Powered" Edge browser, and neither has made me optimistic for the Windows AI PC. My guess is AI Explorer will be a lot like regular Explorer, but slower, with fewer end-user customizations, more bugs, less predictability, and a new, unappealing UI displaying even fewer results at once. I do believe they'll make improvements (from their perspective) in how often, how quickly, and in how many forms use of "my" computer is shared with third parties, though.

I wish I weren't so cynical about all this. I remember when I used to look forward to new OS versions.
 
So the iPhone 15 Pro has a more powerful NPU than Mediocre Lake, and while Intel spruiks Mediocre Lake's AI, Apple thinks their current NPUs are rubbish.

It will be interesting to see Arrow Lake vs. M4. While Lunar Lake will have a much more powerful NPU, it's a 4+4-core U-class processor for premium thin-and-lights and will in no way compete against the M4. Arrow Lake H/HX will have to carry the flag in higher-end laptops.
 
I have developed a couple of apps that use AI and extensive image processing. One of them is loaded with AI up to its neck: it uses about five DNN models and other AI techniques, and ALL the processing is local, on the device. In fact, the apps don't use the data service at all, zero. There are more than enough reasons: security, privacy, costs, and more.
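A fully local inference path like the one described can be sketched in miniature. The toy below implements a single dense layer with a ReLU activation using only the Python standard library: the weights and input features are made-up placeholders standing in for a model shipped inside the app bundle, and the point is simply that nothing here touches the network.

```python
# Toy stand-in for one stage of an on-device DNN pipeline.
# Weights are hard-coded placeholders; a real app would ship them
# in the app bundle rather than fetch them from a server.
WEIGHTS = [[0.5, -0.2, 0.1],
           [0.3, 0.8, -0.5]]
BIAS = [0.1, -0.1]

def relu(x):
    """Clamp negative activations to zero."""
    return max(0.0, x)

def dense_layer(inputs):
    """One fully connected layer + ReLU, computed entirely locally."""
    return [relu(sum(w * x for w, x in zip(row, inputs)) + b)
            for row, b in zip(WEIGHTS, BIAS)]

# Hypothetical features, e.g. statistics from local image preprocessing.
features = [1.0, 2.0, 3.0]
print(dense_layer(features))
```

A production app would of course use an optimized runtime that targets the NPU (on Apple platforms, typically a bundled Core ML model), but the data-flow property is the same: input, weights, and output all stay on the device.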
 
Meanwhile, I'm already experiencing the "Copilot" button on my Windows taskbar and the "AI-Powered" Edge browser, and neither has made me optimistic for the Windows AI PC. My guess is AI Explorer will be a lot like regular Explorer, but slower, with fewer end-user customizations, more bugs, less predictability, and a new, unappealing UI displaying even fewer results at once. I do believe they'll make improvements (from their perspective) in how often, how quickly, and in how many forms use of "my" computer is shared with third parties, though.

I wish I weren't so cynical about all this. I remember when I used to look forward to new OS versions.
You can uninstall Copilot and use a better browser (e.g., Brave).
 
If there's one thing I can give to Apple, it's that these AI things are better when they can be reasonably localized.


You forget the telemetry.

Why do you think Google, Microsoft, Intel and AMD are betting so much on AI lately for consumers?

It's big data.
 
You forget the telemetry.

Why do you think Google, Microsoft, Intel and AMD are betting so much on AI lately for consumers?

It's big data.
Are you insinuating that Apple can't data harvest from local AI? Because they certainly can...
 