Monday, December 23, 2024

We just learned something surprising about how Apple Intelligence was trained

A new research paper from Apple reveals that the company relied on Google’s Tensor Processing Units (TPUs), rather than Nvidia’s more widely deployed GPUs, to train two crucial systems within its upcoming Apple Intelligence service. The paper notes that Apple used 2,048 Google TPUv5p chips to train its on-device AI models and 8,192 TPUv4 processors to train its server AI models.

Nvidia’s chips are highly sought after for good reason, having earned their reputation for performance and compute efficiency. Its products and systems are typically sold as standalone offerings, enabling customers to construct and operate them as they see fit.

Google’s TPUs, on the other hand, are only available as part of the company’s larger cloud services package. That is, you don’t own Google TPUs so much as lease access to them; customers are required to develop their AI models within the Google Cloud ecosystem.

This cloud requirement worked in Apple’s favor, per the research team. They noted that the ability to cluster Google’s TPUs enabled them to harness the necessary processing power to train Apple’s AI models and do so more efficiently than with a standalone system.

Apple’s decision to use Google’s products is unexpected, and not just because of the two companies’ longstanding rivalry. Nvidia holds the dominant share of the AI chip market, with its accelerators estimated to account for between 70% and 95% of all sales.

However, Apple’s decision could be seen as a sign that tech companies are looking to move away from Nvidia’s pricey high-end chips. Amazon, for example, recently revealed that it is developing a new line of AI chips that are purportedly 50% more powerful than Nvidia’s offerings and operate using 50% less power.

Microsoft announced in May that it would offer its cloud customers services built atop AMD’s AI chips, rather than Nvidia’s, while Google made similar moves in April.