Apple used Google’s tensor chips to train on-device and cloud-based Apple Intelligence
Apple Intelligence — the personal intelligence system for iPhone, iPad, and Mac — combines the power of generative models with personal context to deliver intelligence that’s useful and relevant to the user.
An Apple research paper published on Monday showed that the company used chips designed by Google, rather than industry leader Nvidia, to train both its forthcoming on-device and cloud-based Apple Intelligence tools and features.
Apple’s decision to rely on Google’s cloud infrastructure is notable because Nvidia (NVDA.O) produces the most sought-after AI processors.
In the research paper, Apple did not explicitly say that it used no Nvidia chips, but its description of the hardware and software infrastructure of its AI tools and features lacked any mention of Nvidia hardware.
The iPhone maker said that to train its AI models, it used two flavors of Google’s tensor processing units (TPUs), organized into large clusters of chips.
To build the AI model that will operate on iPhones and other devices, Apple used 2,048 of the TPUv5p chips. For its server AI model, Apple deployed 8,192 TPUv4 processors.
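For readers curious what “large clusters” of TPUs look like from the software side, below is a minimal, purely illustrative JAX sketch (JAX is a common framework for TPU workloads). This is an assumption-laden example, not Apple’s actual training code: it simply enumerates the accelerator cores a program can see and shards a toy computation across them, which is the basic mechanism behind scaling training over thousands of chips.

```python
# Hypothetical illustration only -- not Apple's training code.
# Shows how a JAX program sees the cores of a TPU slice and runs a
# data-parallel computation across them.
import jax
import jax.numpy as jnp

# On a TPU v4/v5p slice, jax.devices() lists every accelerator core
# visible to this host (falls back to CPU when no TPU is attached).
devices = jax.devices()
print(f"Visible accelerator cores: {len(devices)}")

# Data parallelism: pmap replicates the function on every core,
# with each core processing one shard (row) of the batch.
@jax.pmap
def square_shard(x):
    return x * x

n = len(devices)
batch = jnp.arange(n * 4, dtype=jnp.float32).reshape(n, 4)  # one row per core
print(square_shard(batch))
```

In practice, frameworks layer model and data parallelism on top of primitives like this to keep thousands of TPU cores busy on a single training run.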
MacDailyNews Take: It makes perfect sense, with Apple in catch-up scramble mode, to use what’s readily available to start; Apple Silicon usage can come later. Plus, Apple’s AI guru, John Giannandrea, formerly led Google’s artificial intelligence efforts.
More info can be found in Apple’s research paper, “Apple Intelligence Foundation Language Models” (July 29, 2024), here.