The Ultra Accelerator Link (UALink) Consortium announced today that Apple, Alibaba, and Synopsys have joined its board of directors. The expansion signals growing industry support for open standards in AI hardware connectivity, potentially challenging NVIDIA’s current dominance in the space.
Key Points
- UALink is a high-speed interconnect standard aimed at optimizing AI cluster performance by connecting hundreds to thousands of accelerators.
- The UALink Consortium has grown to over 65 member companies since its October 2024 incorporation.
- UALink aims to provide an open alternative to proprietary solutions like NVIDIA’s NVLink.
- The first UALink specification (1.0) is expected in Q1 2025, enabling connections of up to 1,024 accelerators at speeds of 200 Gbps per lane.
The AI industry’s push toward open standards gained momentum today as three tech powerhouses joined the board of directors of the Ultra Accelerator Link Consortium. Apple, Alibaba, and Synopsys will now help guide the development of UALink, an open specification for connecting AI accelerators in data centers.
This development marks a strategic shift in how the tech industry approaches AI infrastructure. While companies like NVIDIA have built successful ecosystems around proprietary technologies such as NVLink, the growing UALink Consortium signals a strong appetite for open standards in AI hardware connectivity.
“UALink shows great promise in addressing connectivity challenges and creating new opportunities for expanding AI capabilities and demands,” says Becky Loop, Director of Platform Architecture at Apple, speaking about the company’s decision to join the board.
With AI workloads becoming increasingly demanding, the need for efficient communication between accelerators has never been more critical. The consortium’s upcoming UALink 1.0 specification, expected in the first quarter of 2025, aims to enable connections of up to 1,024 accelerators within an AI pod, with speeds reaching 200 Gbps per lane.
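To put those figures in perspective, here is a minimal back-of-the-envelope sketch in Python of the bandwidth such a pod could expose. The 200 Gbps-per-lane rate and 1,024-accelerator pod size come from the consortium’s stated targets; the lanes-per-accelerator value below is an assumption chosen purely for illustration and is not part of the announcement.

```python
# Back-of-the-envelope bandwidth estimate for a UALink pod.
# Lane rate and pod size reflect the consortium's announced targets;
# lanes-per-accelerator is an illustrative assumption, not spec detail.

LANE_RATE_GBPS = 200          # per-lane speed cited for UALink 1.0
ACCELERATORS_PER_POD = 1_024  # maximum pod size cited for UALink 1.0
LANES_PER_ACCELERATOR = 4     # assumption for illustration only

per_accelerator_gbps = LANE_RATE_GBPS * LANES_PER_ACCELERATOR
pod_aggregate_tbps = per_accelerator_gbps * ACCELERATORS_PER_POD / 1_000

print(f"Per accelerator: {per_accelerator_gbps} Gbps "
      f"(~{per_accelerator_gbps / 8:.0f} GB/s)")
print(f"Pod aggregate:   ~{pod_aggregate_tbps:.0f} Tbps")
```

Under those illustrative assumptions, each accelerator would see on the order of hundreds of gigabits per second of link bandwidth, and a fully populated pod would aggregate into the hundreds of terabits per second, which is the scale of communication the specification is targeting.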
Qiang Liu, VP of Alibaba Cloud, emphasizes the cloud computing perspective: “Driving AI computing accelerator scale-up interconnection technology has significant value in building the competitiveness of intelligent computing supernodes.”
The consortium’s growth to over 65 members since its October 2024 incorporation suggests broad industry support. Current members include major players like AMD, Intel, Google, AWS, and Microsoft, though notably absent is NVIDIA, which continues to develop its proprietary solutions.
For businesses investing in AI infrastructure, this development could mean more choices and potentially lower costs in the future. Open standards typically lead to increased competition among hardware providers and more interoperable solutions.
The addition of Synopsys, a leading provider of semiconductor IP, brings crucial technical expertise to the board. “UALink will be critical in addressing the performance and bandwidth communication demands of hyperscale data centers,” notes Richard Solomon, UALink Board Member and Sr. Staff Product Manager at Synopsys.
As AI continues to transform industries, the race to establish standards for hardware connectivity could determine how quickly and cost-effectively organizations can scale their AI operations. With major tech players now aligned behind UALink, the industry appears to be voting for openness and interoperability in the AI infrastructure of tomorrow.