Intel Launches 4th Gen Xeon Scalable Processors, Max Series CPUs and GPUs
Intel today marked one of the most important product launches in company history with the unveiling of 4th Gen Intel® Xeon® Scalable processors (code-named Sapphire Rapids), the Intel® Xeon® CPU Max Series (code-named Sapphire Rapids HBM) and the Intel® Data Center GPU Max Series (code-named Ponte Vecchio). The new products deliver a leap in data center performance, efficiency and security for customers, along with new capabilities for AI, the cloud, the network and edge, and the world's most powerful supercomputers.
This press release features multimedia. View the full release here: https://www.businesswire.com/news/home/20230110005454/en/
On Jan. 10, 2023, Intel introduced 4th Gen Intel Xeon Scalable processors, expanding on its purpose-built, workload-first strategy and approach. (Credit: Intel Corporation)
Working alongside its customers and partners, Intel is using 4th Gen Xeon to deliver differentiated solutions and systems at scale that tackle their biggest computing challenges. Intel's approach of purpose-built, workload-first acceleration, paired with highly optimized software tuned for specific workloads, enables the company to deliver the right performance at the right power for optimal overall total cost of ownership.
Press Kit: 4th Gen Xeon Scalable Processors
Additionally, as Intel's most sustainable data center processors, 4th Gen Xeon processors offer a range of features for managing power and performance that make optimal use of CPU resources, helping customers achieve their sustainability goals.
"The launch of 4th Gen Xeon Scalable processors and the Max Series product family is a pivotal moment in fueling Intel's turnaround, reigniting our path to leadership in the data center and growing our footprint in new arenas," said Sandra Rivera, Intel executive vice president and general manager of the Data Center and AI Group. "Intel's 4th Gen Xeon and the Max Series product family deliver what customers truly want - leadership performance and reliability within a secure environment for their real-world requirements - driving faster time to value and powering their pace of innovation."
Unlike any other data center processor on the market and already in the hands of customers today, the 4th Gen Xeon family greatly expands on Intel's purpose-built, workload-first strategy and approach.
Leading Performance and Sustainability Benefits with the Most Built-In Acceleration
Today, there are over 100 million Xeons installed in the market - from on-prem servers running IT services, including new as-a-service business models, to networking equipment managing Internet traffic, to wireless base station computing at the edge, to cloud services.
Building on decades of data center, network and intelligent edge innovation and leadership, new 4th Gen Xeon processors deliver leading performance with the most built-in accelerators of any CPU in the world to tackle customers' most important computing challenges across AI, analytics, networking, security, storage and HPC.
Compared with prior generations, 4th Gen Intel Xeon customers can expect a 2.9x [1] average performance-per-watt efficiency improvement for targeted workloads when utilizing built-in accelerators, up to 70 watts [2] of power savings per CPU in optimized power mode with minimal performance loss, and a 52% to 66% lower TCO [3].
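To put the quoted 70-watt-per-CPU figure in context, a rough back-of-the-envelope sketch can translate it into annual fleet energy savings. The fleet size and sockets-per-server values below are illustrative assumptions, not Intel data; only the per-CPU wattage comes from the release.

```python
# Back-of-the-envelope estimate of annual energy savings from the
# up-to-70 W per-CPU figure quoted above (optimized power mode).
# Fleet size and sockets per server are assumptions for illustration.

WATTS_SAVED_PER_CPU = 70   # per the release, best case
CPUS_PER_SERVER = 2        # assumed dual-socket servers
FLEET_SERVERS = 100        # assumed fleet size
HOURS_PER_YEAR = 24 * 365  # 8,760 hours, continuous operation

def annual_savings_kwh(servers: int = FLEET_SERVERS,
                       cpus_per_server: int = CPUS_PER_SERVER,
                       watts_per_cpu: float = WATTS_SAVED_PER_CPU) -> float:
    """Estimated annual energy savings in kWh for the whole fleet."""
    total_watts = servers * cpus_per_server * watts_per_cpu
    return total_watts * HOURS_PER_YEAR / 1000  # watt-hours -> kWh

print(round(annual_savings_kwh()))  # 122640 kWh for the assumed fleet
```

Under these assumptions, 100 dual-socket servers saving 70 W per CPU around the clock would avoid roughly 122,640 kWh per year; actual savings depend on workload mix and how often the optimized power mode applies.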
High Performance Computing
The Xeon CPU Max Series is the first and only x86-based processor with high bandwidth memory, accelerating many HPC workloads without the need for code changes. The Intel Data Center GPU Max Series is Intel's highest-density processor and will be available in several form factors that address different customer needs.
The Xeon CPU Max Series offers 64 gigabytes of high bandwidth memory (HBM2e) on the package, significantly increasing data throughput for HPC and AI workloads. Compared with top-end 3rd Gen Intel® Xeon® Scalable processors, the Xeon CPU Max Series provides up to 3.7 times [10] more performance on a range of real-world applications like energy and earth systems modeling.
Further, the Data Center GPU Max Series packs over 100 billion transistors into a 47-tile package, bringing new levels of throughput to challenging workloads like physics, financial services and life sciences. When paired with the Xeon CPU Max Series, the combined platform achieves up to 12.8 times [13] greater performance than the prior generation when running the LAMMPS molecular dynamics simulator.
Most Feature-Rich and Secure Xeon Platform Yet
Signifying the biggest platform transformation Intel has delivered, 4th Gen Xeon is not only a marvel of acceleration but also an achievement in manufacturing: it combines up to four tiles built on the Intel 7 process on a single package, connected using Intel EMIB (embedded multi-die interconnect bridge) packaging technology. The platform also delivers new features including increased memory bandwidth with DDR5, increased I/O bandwidth with PCIe 5.0, and the Compute Express Link (CXL) 1.1 interconnect.
At the foundation of it all is security. With 4th Gen Xeon, Intel is delivering the most comprehensive confidential computing portfolio of any data center silicon provider in the industry, enhancing data security, regulatory compliance and data sovereignty. Intel remains the only silicon provider to offer application isolation for data center computing with Intel® Software Guard Extensions (Intel® SGX), which provides today's smallest attack surface for confidential computing in private, public and cloud-to-edge environments. Additionally, Intel's new virtual-machine (VM) isolation technology, Intel® Trust Domain Extensions (Intel® TDX), is ideal for porting existing applications into a confidential environment and will debut with Microsoft Azure, Alibaba Cloud, Google Cloud and IBM Cloud.
Finally, the modular architecture of 4th Gen Xeon allows Intel to offer a wide range of processors across nearly 50 targeted SKUs for customer use cases or applications, from mainstream general-purpose SKUs to purpose-built SKUs for cloud, database and analytics, networking, storage, and single-socket edge use cases. The 4th Gen Xeon processor family is On Demand-capable and varies in core count, frequency, mix of accelerators, power envelope and memory throughput as is appropriate for target use cases and form factors addressing customers' real-world requirements.
Intel (Nasdaq: INTC) is an industry leader, creating world-changing technology that enables global progress and enriches lives. Inspired by Moore's Law, we continuously work to advance the design and manufacturing of semiconductors to help address our customers' greatest challenges. By embedding intelligence in the cloud, network, edge and every kind of computing device, we unleash the potential of data to transform business and society for the better. To learn more about Intel's innovations, go to newsroom.intel.com and intel.com.
1 Geomean of the following workloads: RocksDB (IAA vs ZSTD), ClickHouse (IAA vs ZSTD), SPDK large media and database request proxies (DSA vs out of box), Image Classification ResNet-50 (AMX vs VNNI), Object Detection SSD-ResNet-34 (AMX vs VNNI), QATzip (QAT vs zlib)
2 1-node, Intel Reference Validation Platform, 2x Intel® Xeon 8480+ (56C, 2GHz, 350W TDP), HT On, Turbo ON, Total Memory: 1 TB (16 slots/ 64GB/ 4800 MHz), 1x P4510 3.84TB NVMe PCIe Gen4 drive, BIOS: 0091.D05, (ucode: 0x2b0000c0), CentOS Stream 8, 5.15.0-spr.bkc.pc.10.4.11.x86_64, Java Perf/Watt w/ openjdk-11+28_linux-x64_bin, 112 instances, 1550MB Initial/Max heap size. Tested by Intel as of Oct 2022.
3 ResNet50 Image Classification. For a 17-server fleet of 4th Gen Xeon 8490H (RN50 w/AMX), estimated as of November 2022: AI: 55% lower TCO by deploying fewer 4th Gen Intel® Xeon® processor-based servers to meet the same performance requirement. See [E7] at intel.com/processorclaims: 4th Gen Intel Xeon Scalable processors. Results may vary.
4 Geomean of HP Linpack, Stream Triad, SPECrate2017_fp_base est, SPECrate2017_int_base est. See [G2, G4, G6] at intel.com/processorclaims: 4th Gen Intel Xeon Scalable.
5 Up to 10x higher PyTorch real-time inference performance with built-in Intel® Advanced Matrix Extensions (Intel® AMX) (BF16) vs. the prior generation (FP32)
6 Up to 10x higher PyTorch training performance with built-in Intel® Advanced Matrix Extensions (Intel® AMX) (BF16) vs. the prior generation (FP32)
7 Estimated as of 8/30/2022 based on 4th generation Intel® Xeon® Scalable processor architecture improvements vs. 3rd generation Intel® Xeon® Scalable processor at similar core count, socket power and frequency on a test scenario using FlexRAN™ software. Results may vary.
8 Up to 95% fewer cores and 2x higher level 1 compression throughput with 4th Gen Intel Xeon Platinum 8490H using integrated Intel QAT vs. prior generation.
9 Up to 3x higher RocksDB performance with 4th Gen Intel Xeon Platinum 8490H using integrated Intel IAA vs. prior generation.
10 Intel® Xeon® 8380: Test by Intel as of 10/7/2022. 1-node, 2x Intel® Xeon® 8380 CPU, HT On, Turbo On, Total Memory 256 GB (16x16GB 3200MT/s DDR4), BIOS Version SE5C620.86B.01.01.0006.2207150335, ucode revision=0xd000375, Rocky Linux 8.6, Linux version 4.18.0-372.26.1.el8_6.crt1.x86_64, YASK v3.05.07
11 Up to 20% system power savings utilizing 4th Gen Xeon Scalable with Optimized Power mode on vs. off on select workloads including SPECjbb, SPECint and NGINX key handshake.
12 AMD Milan: Tested by Numenta as of 11/28/2022. 1-node, 2x AMD EPYC 7R13 on AWS m6a.48xlarge, 768 GB DDR4-3200, Ubuntu 20.04 Kernel 5.15, OpenVINO 2022.3, BERT-Large, Sequence Length 512, Batch Size 1
Intel® Xeon® 8480+: Tested by Numenta as of 11/28/2022. 1-node, 2x Intel® Xeon® 8480+, 512 GB DDR5-4800, Ubuntu 22.04 Kernel 5.17, OpenVINO 2022.3, Numenta-Optimized BERT-Large, Sequence Length 512, Batch Size 1
Intel® Xeon® Max 9468: Tested by Numenta as of 11/30/2022. 1-node, 2x Intel® Xeon® Max 9468, 128 GB HBM2e 3200 MT/s, Ubuntu 22.04 Kernel 5.15, OpenVINO 2022.3, Numenta-Optimized BERT-Large, Sequence Length 512, Batch Size 1
13 Intel® Xeon® 8380: Test by Intel as of 10/28/2022. 1-node, 2x Intel® Xeon® 8380 CPU, HT On, Turbo On, Total Memory 256 GB (16x16GB 3200MT/s, Dual-Rank), BIOS Version SE5C6200.86B.0020.P23.2103261309, ucode revision=0xd000270, Rocky Linux 8.6, Linux version 4.18.0-372.19.1.el8_6.crt1.x86_64
Intel® Data Center GPU Max Series with DDR Host: Test by Intel as of 10/28/2022. 1-node, 2x Intel® Xeon® Max 9480, HT On, Turbo On, Total Memory 1024 GB DDR5-4800 + 128 GB HBM2e, Memory Mode: Flat, HBM2e not used, 6x Intel® Data Center GPU Max Series, BIOS EGSDCRB1.DWR.0085.D12.2207281916, ucode 0xac000040, Agama pvc-prq-54, SUSE Linux Enterprise Server 15 SP3, Kernel 5.3.18, oneAPI 2022.3.0
Intel® Data Center GPU Max Series with HBM Host: Test by Intel as of 10/28/2022. 1-node, 2x Intel® Xeon® Max 9480, HT On, Turbo On, Total Memory 128 GB HBM2e, 6x Intel® Data Center GPU Max Series, BIOS EGSDCRB1.DWR.0085.D12.2207281916, ucode 0xac000040, Agama pvc-prq-54, SUSE Linux Enterprise Server 15 SP3, Kernel 5.3.18, oneAPI 2022.3.0
© Intel Corporation. Intel, the Intel logo and other Intel marks are trademarks of Intel Corporation or its subsidiaries. Other names and brands may be claimed as the property of others.