Sunday, December 22, 2024

NVIDIA Teams With Google Quantum AI to Enhance Processor Design Using Simulations

NVIDIA (NVDA, Financials) announced a collaboration with Google Quantum AI to accelerate the development of next-generation quantum processors; NVDA stock is up 3% in Monday’s early market trading.

The effort uses NVIDIA's CUDA-Q platform and Eos supercomputer to simulate the physics inside quantum devices, addressing design challenges that stem from hardware noise.

Google Quantum AI is using 1,024 NVIDIA H100 Tensor Core GPUs to run large-scale simulations of quantum processor dynamics within a hybrid quantum-classical computing architecture. The simulations target the challenges posed by noise, which limits the number of operations a quantum processor can perform before a computation breaks down.

Research scientist Guifre Vidal of Google Quantum AI underlined the need for quantum hardware that can scale while keeping noise under control. “Using NVIDIA accelerated computing, we're exploring the noise implications of increasingly larger quantum chip designs,” Vidal said.

The CUDA-Q platform enables simulations of quantum devices with up to 40 qubits, a scale that was previously computationally out of reach. Tim Costa, director of quantum and high-performance computing at NVIDIA, underscored the significance of this capability, noting, “Google's use of the CUDA-Q platform demonstrates the central role GPU-accelerated simulations have in advancing quantum computing.”
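For readers curious what such a noisy device simulation looks like in practice, the sketch below is a minimal CUDA-Q Python example that attaches a depolarizing noise channel to a small circuit and samples it on a density-matrix simulator. The backend name, noise parameters, and circuit are illustrative assumptions rather than details of Google's actual workload; at the 40-qubit scale described above, the same workflow would target NVIDIA's GPU-accelerated backends instead. Consult the CUDA-Q documentation for the targets and API available in your installation.

```python
import cudaq

# Run on a density-matrix simulator so noise channels take effect.
# The target name is an assumption; a GPU-accelerated target would be
# used at larger scale.
cudaq.set_target("density-matrix-cpu")

@cudaq.kernel
def ghz(num_qubits: int):
    # Small GHZ-style circuit standing in for a real processor-dynamics model.
    qubits = cudaq.qvector(num_qubits)
    h(qubits[0])
    for i in range(num_qubits - 1):
        x.ctrl(qubits[i], qubits[i + 1])
    mz(qubits)

# Illustrative noise model: 1% depolarizing error after each H and X gate.
noise = cudaq.NoiseModel()
depolarizing = cudaq.DepolarizationChannel(0.01)
for qubit in range(5):
    noise.add_channel("h", [qubit], depolarizing)
    noise.add_channel("x", [qubit], depolarizing)

# Sample the noisy circuit and print the measurement distribution.
counts = cudaq.sample(ghz, 5, noise_model=noise, shots_count=1000)
print(counts)
```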

Simulations that once took a week to complete can now run in minutes. The underlying software is publicly available through the CUDA-Q platform, enabling quicker design iterations for quantum hardware designers.

The advance marks a step forward in the pursuit of commercially viable quantum computing, which depends on scaling hardware while overcoming noise.

The collaboration between NVIDIA and Google Quantum AI on the CUDA-Q platform highlights a key application of AI supercomputing in quantum technology. The work underscores how simulation-based methods can address practical challenges in quantum computing.

This article first appeared on GuruFocus.
