
Intel betting on AI well beyond data centers


Intel is preparing for rising demand for consumer products that can handle artificial-intelligence computing tasks on-site, betting that businesses will increasingly look beyond public data centers to meet their AI computing needs.

The U.S. chip-making giant is seeking to make AI capabilities commonplace in products ranging from personal computers to edge devices and software, with some companies already moving data away from public clouds to private environments, said Alexis Crowell, vice president and chief technology officer for Intel’s Asia-Pacific and Japan operations.

“Our strategy is to get AI everywhere,” Crowell said in an interview. “If only one solution in the world can compute AI, you’re stuck. If every single…chip and software can compute AI, now you’ve got the flexibility.”

Rising enthusiasm for AI services and the vast amounts of computing power they require have led to a boom in data centers worldwide, including in Asia, where companies such as Alphabet, Microsoft and Amazon are spending billions to build up their cloud services and other computing infrastructure. Morgan Stanley estimates that Asia will account for more than a third of data-center capacity globally by 2027, with $100 billion in potential investments this decade.

Crowell said the pace of demand for traditional data centers, however, is likely to slow as companies and organizations look to strike a balance between using public cloud infrastructure and storing data privately.

“I don’t think that massive burst is going to continue at that steep a trajectory,” Crowell said. “Some of the data that I’m seeing…is now saying that companies are trying to bring back [outsourced IT] for data-privacy reasons or cost-control reasons.”

That aligns with a recent report from research firm IDC, which estimated that by next year, 75% of enterprise-generated data globally will be created and processed at the edge, outside of traditional data centers or the cloud.

Intel is known mainly for its dominance in the market for central processing units, the brains behind personal computers and the servers that run corporate networks and the internet.

The Silicon Valley company’s other big business—supplying chips that power data centers—has been challenged in recent quarters. Data-center operators’ budgets are being eaten up by AI chips, an area where Nvidia dominates and Intel has a smaller foothold. Revenue from Intel’s data center and AI division rose about 5% to $3 billion in the first quarter, while overall revenue climbed about 9% to $12.7 billion.

Intel is seeking to change that as part of efforts to become the world’s second-largest contract chip maker by 2030. Last month, it unveiled its third-generation Gaudi 3 AI chip, which it said outperforms Nvidia’s H100 in training speed and power efficiency and will help the company earn $500 million in AI-chip revenue in the second half of 2024. It said the chip will become available to companies including Dell Technologies, Hewlett Packard Enterprise and Lenovo this quarter.

“There are use cases where a public cloud infrastructure is the right thing for the company, for the customers, for that environment, [and] there are use cases where, no, we [have] to have all of this data completely private for sovereignty reasons or whatever else,” Crowell said.

“It’s just a matter of companies and organizations, including universities and governments, figuring out what’s the right balance,” she said.

Write to Kimberley Kao at kimberley.kao@wsj.com
