NVIDIA A100 GPU Computing Accelerator, 40 GB HBM2e, PCIe 4.0 x16

Manufacturer: NVIDIA
Warranty: 3 years
24/7 technical support
Call for pricing
Architecture: Ampere
Process Size: 7 nm (TSMC)
Transistors: 54 billion
Die Size: 826 mm²
CUDA Cores: 6912
Streaming Multiprocessors: 108
Tensor Cores (3rd Gen): 432
Multi-Instance GPU (MIG) Support: Yes, up to seven instances per GPU
FP64: 9.7 TFLOPS
FP64 Tensor Core: 19.5 TFLOPS
FP32: 19.5 TFLOPS
TF32 Tensor Core: 156 TFLOPS | 312 TFLOPS*
BFLOAT16 Tensor Core: 312 TFLOPS | 624 TFLOPS*
FP16 Tensor Core: 312 TFLOPS | 624 TFLOPS*
INT8 Tensor Core: 624 TOPS | 1248 TOPS*
INT4 Tensor Core: 1248 TOPS | 2496 TOPS*
NVLink: 2-way, low profile, 2-slot bridge
NVLink Interconnect: 600 GB/s bidirectional
GPU Memory: 40 GB HBM2e
Memory Interface: 5120-bit
Memory Bandwidth: 1555 GB/s
System Interface: PCIe 4.0 x16
Thermal Solution: Passive
vGPU Support: NVIDIA Virtual Compute Server with MIG support
Secure and Measured Boot Hardware Root of Trust: CEC 1712
NEBS Ready: Level 3
Power Connector: 8-pin CPU
Maximum Power Consumption: 250 W
* With sparsity
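To sanity-check a few of these figures on an installed card, a minimal CUDA sketch like the one below reads them back via cudaGetDeviceProperties (this assumes the CUDA Toolkit is installed; the device index 0 and the expected values in the comments are illustrative, not part of the listing):

// query_a100.cu - print a few device properties and compare with the spec sheet
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    cudaDeviceProp prop;
    cudaError_t err = cudaGetDeviceProperties(&prop, 0);  // query device 0
    if (err != cudaSuccess) {
        fprintf(stderr, "cudaGetDeviceProperties failed: %s\n", cudaGetErrorString(err));
        return 1;
    }
    printf("Name:               %s\n", prop.name);
    printf("SM count:           %d\n", prop.multiProcessorCount);        // listed: 108
    printf("Global memory:      %.1f GB\n", prop.totalGlobalMem / 1e9);  // listed: 40 GB
    printf("Memory bus width:   %d-bit\n", prop.memoryBusWidth);         // listed: 5120-bit
    printf("Compute capability: %d.%d\n", prop.major, prop.minor);       // Ampere GA100 reports 8.0
    return 0;
}

Compile with nvcc (for example, nvcc query_a100.cu -o query_a100). The reported SM count of 108, at 64 FP32 cores per SM on this architecture, accounts for the 6912 CUDA cores listed above.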