Dell Launches PowerEdge Servers for AI Workloads

Posted on May 16, 2023 - 10:39am by Maria Campos

Dell Technologies has announced the release of its latest PowerEdge rack servers, designed to support the ever-expanding range of artificial intelligence (AI) workloads across the edge and core. The servers include GPU-optimized models that can handle diverse AI use cases, including edge AI and telecom applications.

Recognizing the escalating demand for infrastructure to support AI applications, Dell Technologies has built the new PowerEdge servers to provide a robust foundation for the technology's continued growth. With AI transforming a wide range of industries, having the right infrastructure becomes paramount.

The new PowerEdge servers offer customers a range of cooling options, enabling installation both inside and outside the data center. These include an eight-way NVLink peer-to-peer air-cooled configuration, a four-way NVLink peer-to-peer liquid-assisted air-cooled configuration, and direct liquid-cooled solutions. This flexibility lets customers evaluate their cooling needs and make informed decisions as they plan infrastructure expansion for AI applications.

A major highlight of the latest PowerEdge servers is their significant performance gains. The XE9680 model, equipped with eight Nvidia H100 GPUs connected via NVLink, has demonstrated an eightfold improvement in machine learning (ML) performance testing compared to its predecessor. The server excels in demanding AI training, generative AI model training and fine-tuning, and AI inferencing workloads.

The R760xa and XR5610 models, equipped with Nvidia H100 and Nvidia L4 GPUs respectively, have also proven their mettle in data center and edge inferencing. Both exhibit impressive performance per watt, making them well suited to edge applications. These advancements in GPU-optimized servers, coupled with Intel Xeon processors, lay the foundation for developing cutting-edge AI training and inferencing software, generative AI models, AI DevOps tools, and AI applications.

Additionally, 4th Gen Intel Xeon Scalable processors bring significant improvements to AI workloads. The R760 model, powered by these processors, uses Intel Advanced Matrix Extensions (AMX) to deliver up to an eightfold increase in inference throughput. Developers can leverage AMX to accelerate AI workloads while continuing to use the rest of the instruction set for non-AI tasks.
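
As a rough illustration of how application code typically reaches AMX, the sketch below uses PyTorch's CPU autocast to run inference in bfloat16, which lets the oneDNN backend dispatch matrix multiplications to AMX on 4th Gen Xeon processors when the hardware supports it. The model, layer sizes, and batch shape are placeholders chosen for this example, not a Dell- or Intel-provided workload.

```python
# Minimal sketch: CPU inference in bfloat16 so PyTorch's oneDNN backend can
# dispatch matrix multiplications to Intel AMX on 4th Gen Xeon processors.
# The model and input shapes are placeholders, not a vendor-supplied workload.
import torch
import torch.nn as nn

model = nn.Sequential(          # stand-in model; any nn.Module works
    nn.Linear(1024, 4096),
    nn.ReLU(),
    nn.Linear(4096, 1024),
).eval()

x = torch.randn(32, 1024)       # placeholder batch of inputs

with torch.no_grad(), torch.autocast(device_type="cpu", dtype=torch.bfloat16):
    y = model(x)                # matmuls run in bf16; AMX is used when the
                                # CPU and the oneDNN build support it

print(y.shape)
```

On CPUs without AMX, the same code falls back to standard bfloat16 or float32 kernels, so it runs anywhere PyTorch is installed; the acceleration is simply picked up transparently on supporting hardware.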

These infrastructure advancements are essential to meet the escalating demand for AI and address the growing complexity of models and workloads being developed. Dell Technologies’ latest release of AI- and ML-enabled platforms offers end-users the much-needed flexibility to create AI applications spanning from the core to the edge.

In conclusion, Dell Technologies’ next-generation PowerEdge servers pave the way for a more efficient and powerful AI future, both within and beyond traditional data centers. These vital infrastructure advancements cater to the rising demand for AI while accommodating the evolving requirements of complex AI models and workloads.