Friday, November 22, 2024

Fei-Fei Li picks Google Cloud, where she led AI, as World Labs’ main compute provider


Cloud providers are chasing after AI unicorns, and the latest is Fei-Fei Li’s World Labs. The startup just tapped Google Cloud as its primary compute provider to train AI models, a deal that could be worth hundreds of millions of dollars. But Li’s tenure as chief scientist of AI at Google Cloud wasn’t a factor, the company says.

During the Google Cloud Startup Summit on Tuesday, the companies announced that World Labs will use a large chunk of its funding to license GPU servers on Google Cloud Platform and ultimately train “spatially intelligent” AI models.

A handful of well-funded startups building AI foundation models are highly sought after in the cloud services world. Some of the largest deals include OpenAI, which exclusively trains and runs AI models on Microsoft Azure, and Anthropic, which uses AWS and Google Cloud. These companies regularly pay millions of dollars for computing services, and could one day need even more as their AI models scale. That makes them valuable customers for Google, Microsoft, and AWS to build relationships with early on.

World Labs is certainly building unique, multimodal AI models with significant compute needs. The startup just raised $230 million at a valuation above $1 billion, in a round led by A16Z, to build AI world models. James Lee, general manager of startups and AI at Google Cloud, tells TechCrunch that World Labs’ AI models will one day be able to process, generate, and interact with video and geospatial data. World Labs describes these models as having “spatial intelligence.”

Li has deep ties to Google Cloud, having led the company’s AI efforts until 2018. However, Google denies that this deal is a product of that relationship, and it rejects the idea that cloud services are just commodities. Instead, Lee said services such as its High Performance Toolkit for scaling AI workloads, along with its deep supply of AI chips, were a larger factor.

“Fei-Fei is obviously a friend of GCP,” said Lee in an interview. “GCP wasn’t the only option they looked at. But for all the reasons we talked about – our AI optimized infrastructure and the ability to meet their scalability needs – ultimately they came to us.”

Google Cloud offers AI startups a choice between its proprietary AI chips, called tensor processing units (TPUs), and Nvidia’s GPUs, which Google must purchase and has in more limited supply. Google Cloud is trying to get more startups to train AI models on TPUs, largely as a way to reduce its dependency on Nvidia. All cloud providers are limited today by the scarcity of Nvidia GPUs, so many are building their own AI chips to meet demand. Google Cloud says some startups are training and running inference solely on TPUs; however, GPUs remain the industry’s favorite chip for AI training.

World Labs chose to train its AI models on GPUs in this deal. However, Google Cloud wouldn’t say what went into that decision.

“We worked with Fei-Fei and her product team, and at this stage of their product roadmap, it made more sense for them to work with us on the GPU platform,” said Lee in an interview. “But it doesn’t necessarily mean it’s a permanent decision… Sometimes [startups] move onto different platforms, such as TPUs.”
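For a sense of why that kind of switch can be low-friction, here is a minimal, hypothetical sketch (not from the article, and not World Labs’ actual code) in JAX, a framework whose XLA-compiled training steps run unchanged on whichever accelerator the cloud instance exposes, TPU or GPU. All names, shapes, and the toy model below are invented for illustration.

```python
# Hypothetical illustration only: the same JAX training step compiles via XLA
# to whatever backend is attached to the instance (TPU, GPU, or CPU).
import jax
import jax.numpy as jnp

print("Backend in use:", jax.default_backend())  # e.g. "tpu" or "gpu"

def loss_fn(params, x, y):
    # Toy linear model standing in for a real (far larger) world model.
    pred = x @ params["w"] + params["b"]
    return jnp.mean((pred - y) ** 2)

@jax.jit  # compiled for whichever accelerator JAX detected
def train_step(params, x, y, lr=0.01):
    grads = jax.grad(loss_fn)(params, x, y)
    return jax.tree_util.tree_map(lambda p, g: p - lr * g, params, grads)

key = jax.random.PRNGKey(0)
params = {"w": jax.random.normal(key, (8, 1)), "b": jnp.zeros((1,))}
x = jax.random.normal(key, (32, 8))
y = jax.random.normal(key, (32, 1))
params = train_step(params, x, y)  # identical call on TPU or GPU hardware
```

Real training runs shard work across thousands of accelerators, but the portability of a single step like this is what makes a later platform change plausible rather than a full rewrite.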

Lee would not disclose how large World Labs’ GPU cluster is, but cloud providers often dedicate massive supercomputers to startups training AI models. Google Cloud promised Magic, another startup training AI foundation models, a cluster with “tens of thousands of Blackwell GPUs,” each of which has more power than a high-end gaming PC.

These clusters are easier to promise than they are to fulfill. Google’s cloud services competitor Microsoft is reportedly struggling to meet the insane compute demands of OpenAI, forcing the startup to tap other options for computing power.

World Labs’ deal with Google Cloud is not exclusive, meaning the startup may still strike deals with other cloud providers. However, Google Cloud hosts the majority of World Labs’ workloads today, and says it will try to maintain that business moving forward.
