NVIDIA - DGX GB200 - Rack-Scale AI Infrastructure System

For a quote, contact us at info@tropical.com


SKU: DGX GB200
Weight: 2,850.00 LBS
The NVIDIA DGX GB200 is a rack-scale AI factory purpose-built for training and inferencing trillion-parameter generative AI models. It integrates 36 NVIDIA GB200 Grace Blackwell Superchips into a single liquid-cooled system, providing massive supercomputing performance for mission-critical enterprise AI workloads.

- Features 72 NVIDIA Blackwell GPUs and 36 NVIDIA Grace CPUs with 2,592 Arm Neoverse V2 cores.
- Delivers up to 1,440 PFLOPS of FP4 Tensor Core performance for state-of-the-art AI models.
- Provides up to 13.4 TB of HBM3e GPU memory with a massive 576 TB/s aggregate bandwidth.
- Utilizes fifth-generation NVLink to achieve 1.8 TB/s of GPU-to-GPU bidirectional bandwidth.
- Includes 72x OSFP single-port NVIDIA ConnectX-7 VPI adapters supporting 400 Gb/s InfiniBand.
- Equipped with 36x dual-port NVIDIA BlueField-3 VPI adapters for 200 Gb/s InfiniBand and Ethernet.
- Features 9x L1 NVIDIA NVLink Switches for high-speed, low-latency interconnectivity.
- Managed via NVIDIA Mission Control for streamlined AI factory operations and resilience.
- Optimized for liquid cooling to ensure maximum efficiency in high-density deployments.
- Fully supported by NVIDIA AI Enterprise and DGX OS for a production-ready software stack.
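As a quick sanity check, the rack-level figures above divide evenly across the system's 72 GPUs and 36 CPUs. The sketch below derives approximate per-GPU numbers from the listed aggregates; the per-GPU values are computed here for illustration, not quoted from NVIDIA, and official per-GPU specs may round differently.

```python
# Derive rough per-GPU figures from the rack-level aggregates in the listing.
GPUS = 72
CPUS = 36
CORES_PER_CPU = 72  # assumption: 2,592 total Arm cores / 36 Grace CPUs

total_fp4_pflops = 1440.0   # rack FP4 Tensor Core performance (PFLOPS)
total_hbm_tb = 13.4         # rack HBM3e capacity (TB)
total_hbm_bw_tbs = 576.0    # rack aggregate HBM bandwidth (TB/s)

print(total_fp4_pflops / GPUS)                # 20.0 PFLOPS FP4 per GPU
print(total_hbm_bw_tbs / GPUS)                # 8.0 TB/s HBM bandwidth per GPU
print(round(total_hbm_tb * 1000 / GPUS, 1))   # 186.1 GB HBM3e per GPU
print(CPUS * CORES_PER_CPU)                   # 2592 Arm Neoverse V2 cores
```

The clean divisions (20 PFLOPS FP4 and 8 TB/s of memory bandwidth per Blackwell GPU) show the rack totals are straightforward multiples of the per-GPU hardware.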
