Data center operators can now use standardized, reproducible power measurements of AI workloads to accurately plan electrical infrastructure, cooling systems, and grid connections instead of relying on proprietary estimates.
This paper measures the power consumption of generative AI workloads (training, fine-tuning, and inference) on NVIDIA H100 GPUs at high temporal resolution, then scales those measurements to predict whole-data-center energy demand. The work provides publicly available power profiles and a methodology for infrastructure planning.
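The scaling step from per-GPU measurements to facility-level demand can be illustrated with a minimal sketch. All numbers below (GPU power draw, node counts, non-GPU overhead, PUE) are illustrative assumptions for the example, not figures from the paper, and the helper function is hypothetical rather than part of the published methodology.

```python
# Sketch: scale a measured per-GPU power sample to whole-facility demand.
# All parameter values are illustrative assumptions, not measurements
# from the paper.

def facility_power_kw(per_gpu_watts: float, gpus_per_node: int,
                      num_nodes: int, non_gpu_node_watts: float,
                      pue: float) -> float:
    """Estimate total facility draw (kW) from a per-GPU power sample.

    IT load = nodes x (GPU draw + non-GPU node overhead); facility
    load = IT load x PUE (power usage effectiveness).
    """
    it_watts = num_nodes * (gpus_per_node * per_gpu_watts + non_gpu_node_watts)
    return it_watts * pue / 1000.0

# Example: a hypothetical 1,000-node cluster of 8-GPU H100 servers,
# each GPU drawing 650 W under a training workload, 1.2 kW of
# non-GPU power per node, and a PUE of 1.3.
demand = facility_power_kw(650.0, 8, 1000, 1200.0, 1.3)
print(f"{demand:.0f} kW")  # -> 8320 kW
```

In practice the per-GPU input would come from a time-resolved power profile rather than a single average, so the same calculation applied per sample yields a facility-level demand curve for sizing electrical and cooling infrastructure.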