The NVIDIA DGX H100 is the foundational building block for enterprise AI, engineered to handle the most demanding AI workloads, including generative AI and large language models. This liquid-cooling-ready configuration is optimized for high-density data center environments that require advanced thermal management.
* 8x NVIDIA H100 Tensor Core GPUs delivering 32 petaFLOPS of FP8 performance.
* 640GB total GPU memory with 3.35TB/s of memory bandwidth.
* Fourth-generation NVIDIA NVLink providing 900GB/s of GPU-to-GPU bandwidth.
* Dual Intel Xeon Platinum 8480C processors with 112 total cores.
* 2TB of system memory to support massive data processing.
* 10x NVIDIA ConnectX-7 400Gb/s network interfaces for high-speed scalability.
* 30TB of NVMe SSD storage for high-throughput data access.
* Integrated NVIDIA AI Enterprise software suite for end-to-end AI development.
* Liquid-cooling ready chassis for superior heat dissipation in dense racks.
* Standard 19-inch rackmount form factor at 8U height.
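The headline aggregate figures above can be sanity-checked from per-component values. The following is a minimal sketch, assuming the publicly listed per-unit specs (4 petaFLOPS FP8 with sparsity and 80GB HBM3 per H100 SXM GPU, 56 cores per Xeon Platinum 8480C); these per-unit numbers come from vendor spec sheets, not from this listing.

```python
# Sanity-check the DGX H100 aggregate specs from assumed per-unit values.
GPUS = 8
FP8_PFLOPS_PER_GPU = 4    # H100 SXM FP8 with sparsity (assumed)
HBM_GB_PER_GPU = 80       # H100 SXM HBM3 capacity (assumed)
CPUS = 2
CORES_PER_CPU = 56        # Xeon Platinum 8480C (assumed)
NICS = 10
GBPS_PER_NIC = 400        # ConnectX-7 port speed

total_fp8_pflops = GPUS * FP8_PFLOPS_PER_GPU   # 32 petaFLOPS
total_gpu_mem_gb = GPUS * HBM_GB_PER_GPU       # 640 GB
total_cpu_cores = CPUS * CORES_PER_CPU         # 112 cores
total_net_gbps = NICS * GBPS_PER_NIC           # 4,000 Gb/s aggregate

print(total_fp8_pflops, total_gpu_mem_gb, total_cpu_cores, total_net_gbps)
```

Each product matches the corresponding bullet, which is why the listing can quote single aggregate numbers for an 8-GPU, dual-socket system.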
NVIDIA - DGX H100 Liquid Cooling Ready - AI Infrastructure Server
For a quote, contact us at info@tropical.com
- SKU: DGX H100 Liquid Cooling Ready
- Weight: 287.60 lb
- Width: 19.00 in
- Height: 14.00 in
- Depth: 35.30 in