Tuesday, March 4, 2025

Broadcom boosts edge AI infrastructure for faster connectivity

The evolution of artificial intelligence is fueling a surge in demand for edge AI infrastructure. Companies such as Broadcom Inc. are stepping up to deliver open, scalable and power-efficient solutions to meet this need.

Broadcom has made significant strides in AI infrastructure over the past year, according to Vijay Nagarajan (pictured), vice president of strategy and marketing for the Semiconductor Solutions Group at Broadcom. The company’s commitment to these principles has driven its product decisions and led to advancements across its infrastructure portfolio.

Broadcom’s Vijay Nagarajan talks with theCUBE about the company’s evolving AI infrastructure strategy, from high-speed connectivity to edge intelligence.

“In the last 12 months, I would say we’ve delivered to this vision, and we’ve also delivered to the product promises that we’ve made across all of our infrastructure products,” Nagarajan said. “There’s XPUs, there’s Ethernet, [peripheral component interconnect express], optics and foundational technologies like SerDes and [digital signal processing]. We’ve had tremendous success.”

Nagarajan spoke with theCUBE’s Dave Vellante and Savannah Peterson at MWC25, during an exclusive broadcast on theCUBE, SiliconANGLE Media’s livestreaming studio. They discussed Broadcom’s evolving AI infrastructure strategy, from high-speed connectivity to edge intelligence. (* Disclosure below.)

Enabling high-speed AI connectivity

SerDes technology plays a crucial role in AI infrastructure, facilitating fast, reliable communication between components, according to Nagarajan. These high-speed connections help maintain strong signal-to-noise ratios, ensuring data integrity across various products.

“SerDes needs to make sure that your signal gets across from one end of the link to the other despite all the noise that you would see in the electrical circuits,” Nagarajan said. “Every piece of connectivity silicon that we do requires the SerDes, and the quality of the SerDes makes those products that much more impactful and valuable.”
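The core idea behind a SerDes (serializer/deserializer) is to flatten parallel data into a serial bit stream, push it across a noisy link, and reassemble it on the far side while detecting corruption. A toy Python sketch of that round trip (purely illustrative, with hypothetical function names; real SerDes silicon does this in analog and digital hardware at tens of gigabits per second, not in software):

```python
import random

def serialize(data: bytes) -> list[int]:
    """Flatten bytes into a serial bit stream, appending an even-parity bit per byte."""
    bits = []
    for byte in data:
        byte_bits = [(byte >> i) & 1 for i in range(7, -1, -1)]
        bits.extend(byte_bits)
        bits.append(sum(byte_bits) % 2)  # parity lets the receiver detect single-bit flips
    return bits

def deserialize(bits: list[int]) -> tuple[bytes, list[int]]:
    """Rebuild bytes from the stream; report indices of bytes that fail parity."""
    out, bad = bytearray(), []
    for i in range(0, len(bits), 9):
        chunk, parity = bits[i:i + 8], bits[i + 8]
        value = 0
        for b in chunk:
            value = (value << 1) | b
        out.append(value)
        if sum(chunk) % 2 != parity:
            bad.append(i // 9)  # this byte was corrupted in transit
    return bytes(out), bad

def noisy_link(bits: list[int], flip_prob: float = 0.0) -> list[int]:
    """Model an electrical link that randomly flips bits (noise on the wire)."""
    return [b ^ 1 if random.random() < flip_prob else b for b in bits]

payload = b"edge"
received, errors = deserialize(noisy_link(serialize(payload)))
assert received == payload and errors == []  # clean link: data survives intact
```

On a clean link the payload round-trips exactly; as `flip_prob` rises, the parity checks start flagging corrupted bytes, which is the software analogue of the signal-to-noise margin Nagarajan describes.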

Broadcom recognizes the central role of SerDes in powering high-performance edge AI infrastructure. It has invested heavily in advancing its capabilities to keep pace with the growing demands of AI and machine learning. But SerDes is just one piece of the equation, according to Nagarajan.

“SerDes is important, but it’s equally important that other pieces of our technology are ready as well,” he said. “That includes advancements on the Ethernet front and ensuring XPUs are equipped with the necessary capabilities.”

Edge AI infrastructure: The spinal cord of modern computing

Broadcom is extending its AI expertise beyond large-scale infrastructure, expanding its capabilities in edge AI infrastructure through its broadband access offerings, according to Nagarajan. A key development in this space is the company’s unified DOCSIS 4.0 chipset, which integrates AI and machine learning to enhance real-time processing in home networks.

“In the last 12 months, we had a couple of very interesting announcements that we made in the edge AI space,” Nagarajan said. “[DOCSIS 4.0] has a neural engine that is capable of supporting a lot of the AI use cases in your home. Think of your broadband gateway as your edge device, handling video processing for a great video experience.”

Beyond home connectivity, Broadcom is embedding AI-driven neural engines across its broadband ecosystem, including modems, smart applications and passive optical network devices. These advancements improve network optimization, power management and real-time decision-making, according to Nagarajan.

“From an edge perspective, we’ve made sure that the edge access network actually has neural capabilities … [and] we’ve integrated these neural engines not just in the modems, but also in the smart apps and nodes,” he said. “Or you take our … 50-gig PON devices. Today, they have neural engine spec, so the two sides of the link — the [optical line terminals] and the [optical network units] — have the neural engine. Think about what it can offer service providers and operators [such as] network management capabilities [and] power management capabilities. The possibilities are endless.”

The relationship between edge AI infrastructure and centralized AI infrastructure can be thought of as analogous to the spinal cord and brain, according to Nagarajan. Just as the spinal cord processes reflexive actions before signals reach the brain, edge AI enables real-time, localized decision-making before data is sent to cloud or data center compute.

“When you go touch a hot object, your neural sensors drive your information back to the spinal cord and to the brain,” Nagarajan explained. “The spinal cord first does the processing and lets you know, ‘Hey, take your hand off.’ The information goes back to the brain … and the brain then triggers the pain sensation. You have a reflex that’s driven by the spinal cord, and then you have the brain doing the ultimate processing.”
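The reflex analogy maps naturally onto a two-tier design: a fast local decision path at the edge, with raw data still forwarded upstream for slower, centralized analysis. A minimal sketch of that split (hypothetical names, no relation to any Broadcom product):

```python
def edge_reflex(temp_c: float, threshold: float = 60.0) -> str:
    """Immediate local decision at the edge device -- the 'spinal cord' path."""
    return "retract" if temp_c > threshold else "ok"

def cloud_analysis(readings: list[float]) -> float:
    """Slower, centralized processing over accumulated data -- the 'brain' path."""
    return sum(readings) / len(readings)

readings = []
for temp in (25.0, 80.0, 30.0):
    action = edge_reflex(temp)   # low-latency response, no round trip to the cloud
    readings.append(temp)        # raw data still flows upstream for deeper analysis
    if action == "retract":
        print(f"edge: retract at {temp} C")
print(f"cloud: mean temperature {cloud_analysis(readings):.1f} C")
```

The edge tier reacts within the loop iteration, while the centralized tier only ever sees the aggregate, mirroring the reflex-before-cognition ordering in Nagarajan's analogy.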

Here’s the complete video interview, part of SiliconANGLE’s and theCUBE’s coverage of MWC25:

(* Disclosure: TheCUBE is a paid media partner for MWC25. The sponsors of theCUBE’s event coverage do not have editorial control over content on theCUBE or SiliconANGLE.)

Photo: SiliconANGLE
