
AI networks have traditionally relied on conventional Ethernet leaf-and-spine architectures, which were not designed to support high-performance AI workloads at scale. The result is a mismatch between the compute power available and the network's ability to handle the increased data flow: AI workloads cannot reach their full potential, leading to inefficiencies and missed opportunities for breakthrough discoveries.
Proprietary alternatives such as NVIDIA's InfiniBand deliver high performance but come with their own limitations. InfiniBand lacks network interoperability and flexibility, a challenge for hyperscalers seeking to avoid lock-in to a single vendor's ecosystem. This restricts the ability to seamlessly integrate diverse hardware and software components, hindering the scalability and adaptability required by rapidly growing AI clusters.
Recognizing these challenges, a new generation of network solutions is emerging to change the way AI workloads are supported and managed. One of those solutions is DriveNets’ Network Cloud-AI, an AI networking solution designed to maximize the utilization of AI infrastructure and improve the performance of large-scale AI workloads.
Network Cloud-AI, built on DriveNets’ Network Cloud, was validated by leading hyperscalers in recent trials as a cost-effective Ethernet solution for AI networking. Leveraging OCP’s Distributed Disaggregated Chassis (DDC) architecture, Network Cloud-AI is designed to meet the demands of high-scale service provider networks, offering an array of advantages.
With Network Cloud-AI, connectivity is taken to new heights, seamlessly linking up to 32,000 GPUs within a single AI cluster. This capability, combined with speeds ranging from 100G to 800G, ensures optimal load balancing and paves the way for massive AI workloads.
The distributed leaf-and-spine model of Network Cloud-AI excels at uniformly distributing traffic across the AI network fabric. This intelligent approach guarantees maximum network utilization, enabling AI workloads to perform at their peak potential without any packet loss, even under the most demanding circumstances.
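The benefit of spreading traffic uniformly across a fabric, rather than pinning each flow to a single path, can be sketched in a few lines. The following is an illustrative toy model, not DriveNets code: it contrasts classic per-flow ECMP hashing, where a handful of large AI "elephant" flows can pile onto one uplink, with per-packet spraying, which rotates every packet across all uplinks. The link count and flow sizes are assumed for illustration.

```python
from collections import Counter

N_LINKS = 8  # hypothetical number of fabric uplinks

def ecmp_per_flow(flows):
    """Classic ECMP: every packet of a flow hashes to the same uplink."""
    load = Counter()
    for flow_id, n_packets in flows:
        load[hash(flow_id) % N_LINKS] += n_packets
    return load

def per_packet_spray(flows):
    """Packet spraying: successive packets rotate across all uplinks."""
    load = Counter()
    next_link = 0
    for _, n_packets in flows:
        for _ in range(n_packets):
            load[next_link] += 1
            next_link = (next_link + 1) % N_LINKS
    return load

# Four large AI flows (e.g. collective-communication streams), 1000 packets each.
flows = [(f"flow-{i}", 1000) for i in range(4)]

ecmp = ecmp_per_flow(flows)
spray = per_packet_spray(flows)
print("per-flow ECMP load:", sorted(ecmp.values(), reverse=True))
print("sprayed load:      ", sorted(spray.values(), reverse=True))
```

With per-flow hashing the busiest link carries at least one entire 1000-packet flow (and more if flows collide on the same hash bucket), while spraying leaves every link carrying exactly 500 packets.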
DriveNets Network Cloud-AI implements an advanced traffic scheduling mechanism that operates in a congestion-free environment. By carefully avoiding flow collisions, this innovative solution reduces job completion time, facilitating faster and more efficient AI operations. Furthermore, with sub-10ms automatic path convergence, the network offers zero-impact failover, ensuring uninterrupted AI processing.
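The idea behind congestion-free scheduling can be illustrated with a minimal sketch of grant-based admission (assumed mechanics for illustration, not DriveNets internals): an ingress queue may inject a cell into the fabric only when the scheduler grants it credit, so the fabric is never asked to carry more than its capacity and nothing is dropped. The slot capacity and queue sizes below are arbitrary.

```python
from collections import deque

FABRIC_CAPACITY = 4  # cells the fabric can carry per time slot (assumed)

def run_scheduled(queues, slots):
    """Each slot, the scheduler grants at most FABRIC_CAPACITY cells total."""
    delivered, dropped = 0, 0
    for _ in range(slots):
        grants = FABRIC_CAPACITY
        for q in queues:
            while q and grants > 0:
                q.popleft()     # cell admitted to the fabric and delivered
                delivered += 1
                grants -= 1
    # dropped stays 0: no cell enters the fabric without a grant,
    # so the fabric is never oversubscribed.
    return delivered, dropped

# Three ingress virtual output queues, 10 cells each.
queues = [deque(range(10)) for _ in range(3)]
delivered, dropped = run_scheduled(queues, slots=10)
print(delivered, dropped)  # 30 0
```

Because admission is gated on grants rather than reacting to congestion after the fact, the loss-free property holds regardless of how bursty the ingress queues are.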
Embracing an Ethernet-based approach, DriveNets Network Cloud-AI rejects proprietary lock-in in favor of openness and compatibility, integrating with a wide range of white-box manufacturers, network interface cards and AI accelerator ASICs. This vendor-agnostic design empowers organizations to build customized AI infrastructure from the components that suit their specific needs.
The DriveNets Network Cloud-AI solution is definitely a game-changer in the AI networking domain, redefining the boundaries of what is achievable.
"AI compute resources are extremely costly and must be fully utilized to avoid 'idle cycles' as they await networking tasks," said Ido Susan, DriveNets co-founder and CEO. "Leveraging our experience supporting the world's largest networks, we have developed DriveNets Network Cloud-AI. Network Cloud-AI has already achieved up to a 30% reduction in idle time in recent trials, enabling exponentially higher AI throughput compared to a standard Ethernet solution."
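A back-of-the-envelope calculation shows why trimming idle time matters. The split between compute and network-wait time below is assumed for illustration, not a DriveNets figure; only the 30% idle-time reduction comes from the quote above.

```python
# Assumed per-step timings for a hypothetical training job.
compute_ms = 70.0   # time the GPU spends computing per training step
network_ms = 30.0   # time the GPU idles awaiting network transfers

baseline_step = compute_ms + network_ms               # 100 ms per step
improved_step = compute_ms + network_ms * (1 - 0.30)  # idle cut by 30%

speedup = baseline_step / improved_step
print(f"step time: {baseline_step:.0f} ms -> {improved_step:.0f} ms")
print(f"throughput gain: {(speedup - 1) * 100:.1f}%")
```

Under these assumed numbers, a 30% cut in network idle time shortens each step from 100 ms to 91 ms, roughly a 10% throughput gain for the same hardware; the larger the network-wait share of a step, the larger the payoff.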
Network Cloud-AI stands as the ideal solution to unlock the full potential of AI clusters and accelerate groundbreaking advancements across diverse industries. With this new offering, DriveNets is well-positioned to address the growing AI networking segment – a $10B market opportunity, according to 650 Group and Alan Weckel.
Edited by Alex Passett