NVIDIA A100 GPU Computing Accelerator, 40 GB HBM2e, PCIe 4.0 x16

Manufacturer: Nvidia
Warranty: 3 years
24/7 technical support
Call for price
Architecture: Ampere
Process Size: 7 nm (TSMC)
Transistors: 54 billion
Die Size: 826 mm²
CUDA Cores: 6912
Streaming Multiprocessors: 108
Tensor Cores (Gen 3): 432
Multi-Instance GPU (MIG) Support: Yes, up to seven instances per GPU
FP64: 9.7 TFLOPS
FP64 Tensor Core: 19.5 TFLOPS
FP32: 19.5 TFLOPS
TF32 Tensor Core: 156 TFLOPS | 312 TFLOPS*
BFLOAT16 Tensor Core: 312 TFLOPS | 624 TFLOPS*
FP16 Tensor Core: 312 TFLOPS | 624 TFLOPS*
INT8 Tensor Core: 624 TOPS | 1248 TOPS*
INT4 Tensor Core: 1248 TOPS | 2496 TOPS*
NVLink: 2-way, low-profile, 2-slot bridge
NVLink Interconnect: 600 GB/s bidirectional
GPU Memory: 40 GB HBM2e
Memory Interface: 5120-bit
Memory Bandwidth: 1555 GB/s
System Interface: PCIe 4.0 x16
Thermal Solution: Passive
vGPU Support: NVIDIA Virtual Compute Server with MIG support
Secure and Measured Boot Hardware Root of Trust: CEC 1712
NEBS Ready: Level 3
Power Connector: 8-pin CPU
Maximum Power Consumption: 250 W
* With sparsity
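The throughput and bandwidth figures in the table follow directly from the listed core counts and bus width. A minimal back-of-the-envelope sketch, assuming a 1410 MHz boost clock and a ~2.43 Gbps per-pin memory data rate (neither of which appears in this listing):

```python
# Rough check of the A100 spec figures above.
# Assumptions (not in the listing): 1410 MHz boost clock,
# ~2.43 Gbps per-pin memory data rate.

BOOST_CLOCK_HZ = 1.41e9   # assumed boost clock
CUDA_CORES = 6912
TENSOR_CORES = 432
BUS_WIDTH_BITS = 5120
PIN_RATE_GBPS = 2.43      # assumed per-pin data rate

# FP32: each CUDA core retires one FMA (2 FLOPs) per clock.
fp32_tflops = CUDA_CORES * 2 * BOOST_CLOCK_HZ / 1e12

# FP16 Tensor Core (dense): each 3rd-gen tensor core performs
# 256 FMAs (512 FLOPs) per clock.
fp16_tensor_tflops = TENSOR_CORES * 512 * BOOST_CLOCK_HZ / 1e12

# The starred figures: structured sparsity doubles the dense rate.
fp16_sparse_tflops = fp16_tensor_tflops * 2

# Memory bandwidth: bus width in bytes x per-pin data rate.
mem_bw_gbs = BUS_WIDTH_BITS / 8 * PIN_RATE_GBPS

print(f"FP32:               {fp32_tflops:.1f} TFLOPS")
print(f"FP16 Tensor dense:  {fp16_tensor_tflops:.0f} TFLOPS")
print(f"FP16 Tensor sparse: {fp16_sparse_tflops:.0f} TFLOPS")
print(f"Memory bandwidth:   {mem_bw_gbs:.0f} GB/s")
```

Evaluating this reproduces the table's 19.5 TFLOPS FP32, 312/624 TFLOPS FP16 Tensor Core, and 1555 GB/s bandwidth figures.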