Here are the new AI features coming to your iPhone with iOS 18


It isn’t news that Apple is focusing on AI at the moment. Despite its early start with Siri, the company has fallen behind in the AI arms race (whether or not you regard that as a bad thing) and has made no visible progress at all with the latest generative AI trend. All of this is expected to be remedied by software announcements at WWDC next month.

What is news is the apparent revelation this past weekend of some actual, juicy details. In the latest edition of his Power On newsletter, Bloomberg leaker-analyst Mark Gurman claims Apple is about to make some “bold changes” to the iPhone’s iOS operating system, and reveals both broad strategic changes of direction for the company and specific features he predicts will be coming to the iPhone this year.

On the strategic front, Gurman identifies two of Apple’s longstanding priorities (largely refraining from collecting user data, and running a large proportion of Siri’s processing on the device) as the cause of some of its difficulties in this area. The lack of data, while motivated by a wish to safeguard user privacy, means a lack of training fuel for large language models. Similarly, on-device processing is in theory a better way to protect user data, since information about requests doesn’t have to pass through server farms or be accessible to (likely outsourced) company employees, but it makes strong performance harder to deliver consistently.

It would be a major reversal of company culture for data mining to become a priority, but in an acknowledgment that its approach needs to change, Gurman says Apple will transition to more of a cloud-based model for its data processing. Data centers will be upgraded with powerful Mac chips to handle the additional strain.

“The move shows that Apple recognizes the need to evolve,” Gurman writes. “As part of the changes, the company will improve Siri’s voice capabilities, giving it a more conversational feel, and add features that help users with their day-to-day lives.”

On the features front, meanwhile, Gurman says we can expect “services like auto-summarizing notifications from your iPhone, giving a quick synopsis of news articles and transcribing voice memos.” He also predicts that the existing auto-populating features for Calendar and suggested apps will be improved, and that AI-based editing will appear in Photos, with the caveat that this won’t be as impressive as similar features already available in Adobe’s apps.

As intriguing as all of this sounds, it fails to address two of the most noteworthy elements of an AI strategy: chatbots and search. Sadly for Apple fans, it appears the company is well behind in both of these areas. Cupertino’s own generative AI tech isn’t strong enough to power a workable chatbot, so it’s instead planning to rely on OpenAI’s ChatGPT for this purpose, at least in the short term.

In the long run, Gurman says Apple will need to make a chatbot of its own. The situation is not dissimilar to Maps, which began as a partnership with Google before Apple developed its own solution. “For now,” Gurman says, “[Apple] believes the combination of its homegrown AI features (both on devices and in the cloud) and the OpenAI deal will be enough to get the job done.”

Much the same applies to AI-powered search; Apple’s own search project never really got anywhere, and the company is in no position to launch its own search engine. All of which adds up to an upcoming WWDC keynote that will feature plenty of small fireworks but is likely to leave users with the feeling that Apple is still a long way from catching up in the AI race.

For all the latest news, rumors, and information about Apple’s June announcements, check our WWDC superguide.
