NEW YORK--(BUSINESS WIRE)--Arista Networks (NYSE: ANET), a leading provider of cloud and AI networking solutions, today announced the Arista Etherlink™ AI platforms, designed to deliver optimal network performance for the most demanding AI workloads, including training and inferencing.
Powered by new AI-optimized Arista EOS features, the Arista Etherlink AI portfolio supports AI cluster sizes ranging from thousands to hundreds of thousands of XPUs with highly efficient one- and two-tier network topologies. These topologies deliver superior application performance compared to more complex multi-tier networks, while offering advanced monitoring capabilities including flow-level visibility.
“The network is core to successful job completion outcomes in AI clusters,” said Alan Weckel, Founder and Technology Analyst for 650 Group. “The Arista Etherlink AI platforms offer customers the ability to have a single 800G end-to-end technology platform across front-end, training, inference, and storage networks. Customers benefit from leveraging the same well-proven Ethernet tooling, security, and expertise they have relied on for decades while easily scaling up for any AI application.”
Arista’s Etherlink AI Platforms
The 7060X6 AI Leaf switch family employs Broadcom Tomahawk 5® silicon, with a capacity of 51.2 Tbps and support for 64 800G or 128 400G Ethernet ports.
The 7800R4 AI Spine is the 4th generation of Arista’s flagship 7800 modular systems. It implements the latest Broadcom Jericho3-AI processors with an AI-optimized packet pipeline and offers non-blocking throughput with the proven virtual output queuing architecture. The 7800R4-AI supports up to 460 Tbps in a single chassis, which corresponds to 576 800G or 1152 400G Ethernet ports.
The 7700R4 AI Distributed Etherlink Switch (DES) supports the largest AI clusters, offering customers massively parallel distributed scheduling and congestion-free traffic spraying based on the Jericho3-AI architecture. The 7700 represents the first in a new series of ultra-scalable, intelligent distributed systems that can deliver the highest consistent throughput for very large AI clusters.
A single-tier network topology with Etherlink platforms can support over 10,000 XPUs. With a 2-tier network, Etherlink can support more than 100,000 XPUs. Minimizing the number of network tiers is essential for optimizing AI application performance, reducing the number of optical transceivers, lowering cost and improving reliability.
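To make the tier arithmetic concrete, the back-of-the-envelope sketch below estimates how many XPU-facing ports a two-tier leaf-spine fabric can offer given the leaf and spine radices. It is an illustrative calculation only, not an Arista reference design: the port counts and oversubscription ratio passed in the example are assumptions, and the scale Arista cites depends on its specific platform configurations.

```python
# Illustrative two-tier leaf-spine sizing sketch (assumed radices, not an
# Arista reference design).

def max_xpus_two_tier(leaf_ports: int, spine_ports: int,
                      oversubscription: float = 1.0) -> int:
    """Estimate the maximum XPU endpoints in a two-tier leaf-spine fabric.

    leaf_ports: total ports per leaf switch (uniform port speed assumed)
    spine_ports: ports per spine switch
    oversubscription: downlink-to-uplink ratio at the leaf (1.0 = non-blocking)
    """
    # Split each leaf's ports between XPU-facing downlinks and spine uplinks.
    downlinks = int(leaf_ports * oversubscription / (1 + oversubscription))
    # Each leaf connects to every spine, so the spine's port count bounds
    # how many leaves the fabric can hold.
    max_leaves = spine_ports
    return max_leaves * downlinks

# Assumed example: 128 x 400G leaf, 1152 x 400G spine.
print(max_xpus_two_tier(leaf_ports=128, spine_ports=1152))       # 73728 (non-blocking)
print(max_xpus_two_tier(leaf_ports=128, spine_ports=1152,
                        oversubscription=2.0))                    # 97920 (2:1 oversubscribed)
```

The takeaway matches the claim above: with high-radix leaves and spines, two tiers are enough to reach cluster scales on the order of 100,000 XPUs, and each tier removed also removes a layer of optics and switching hops.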
All Etherlink switches support the emerging Ultra Ethernet Consortium (UEC) standards, which are expected to provide additional performance benefits when UEC NICs become available in the near future.
“Broadcom is a firm believer in the versatility, performance, and robustness of Ethernet, which makes it the technology of choice for AI workloads,” said Ram Velaga, senior vice president and general manager, Core Switching Group, Broadcom. “By leveraging industry-leading Ethernet chips such as Tomahawk 5 and Jericho3-AI, Arista provides the ideal accelerator-agnostic solution for AI clusters of any shape or size, outperforming proprietary technologies and providing flexible options for fixed, modular, and distributed switching platforms.”
Arista EOS Smart AI Suite
The rich features of Arista EOS and CloudVision complement these new networking-for-AI platforms. The innovative suite of AI-for-networking, security, segmentation, visibility, and telemetry features brings AI-grade robustness and protection to high-value AI clusters and workloads. For example, Arista EOS’s Smart AI suite of innovative enhancements now integrates with SmartNIC providers to deliver advanced RDMA-aware load balancing and QoS. Arista AI Analyzer, powered by Arista AVA™, automates configuration and improves visibility and intelligent performance analysis of AI workloads.
“Arista’s competitive advantage consistently comes down to our rich operating system and broad product portfolio to address AI networks of all sizes,” said Hugh Holbrook, Chief Development Officer, Arista Networks. “Innovative AI-optimized EOS features enable faster deployment, reduce configuration issues, deliver flow-level performance analysis, and improve AI job completion times for any size AI cluster.”
Availability
The 7060X6 is available now. The 7800R4-AI and 7700R4 DES are in customer testing and will be available in the second half of 2024.
About Arista
Arista Networks is an industry leader in data-driven, client-to-cloud networking for large AI, data center, campus and routing environments. Its award-winning platforms deliver availability, agility, automation, analytics, and security through an advanced network operating stack. For more information, visit www.arista.com.
ARISTA, AGNI, AVA, CloudVision, EOS, Etherlink, MSS, and NetDL are among the registered and unregistered trademarks of Arista Networks in jurisdictions worldwide. Other company names or product names may be trademarks of their respective owners. Additional information and resources can be found at www.arista.com. This press release contains forward-looking statements including, but not limited to, statements regarding the performance and capabilities of Arista’s products and services. All statements other than statements of historical fact are statements that could be deemed forward-looking statements. Forward-looking statements are subject to risks and uncertainties that could cause actual performance or results to differ materially from those expressed in the forward-looking statements, including rapid technological and market change, customer requirements, and industry standards, as well as other risks stated in our filings with the SEC available on Arista's website at www.arista.com and the SEC's website at www.sec.gov. Arista disclaims any obligation to publicly update or revise any forward-looking statement to reflect events that occur or circumstances that exist after the date on which they were made.