Last week, at WWDC 2024, Apple unveiled its new AI system, Apple Intelligence. It will be available across iOS, iPadOS, and macOS, but only on the iPhone 15 Pro and M-series iPads and Macs. With the new AI features locked to its newest products, Apple’s AI chief recently explained why the system is limited to its flagship smartphone and Apple silicon-powered iPads and Macs.
Appearing on The Talk Show Live (spotted by The Verge), Apple’s senior vice president of Machine Learning and AI Strategy, John Giannandrea, explained that running large language models (LLMs) requires a lot of computing power, so any hardware that runs them must be fast and powerful enough to handle the workload.
“Running large language models is computationally prohibitively expensive,” explained Giannandrea. “So it’s a combination of bandwidth on the device, it’s the size of the ANE [Apple Neural Engine], it’s the oomph on the device to make these models fast enough to be useful. In theory, you could run these models on an older device, but it would be so slow as to be useless.”