> the takeaway is that ordinary Google searching is also quite energy-intense.
A related takeaway should be that machine inference is pervasive and has been for years, and that defining "AI" to mean just chatbots is to ignore most of the iceberg.
> I'd still love to see a report that accurately captures training cost. Today's report[1] notably excludes training cost.
From 2022, so possibly out of date: "ML training and inference are only 10%–15% of Google’s total energy use for each of the last three years, each year split ⅗ for inference and ⅖ for training." That's probably close enough to call 50/50, which would put the full energy cost of delivering an AI result at roughly double the inference energy.
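Back-of-envelope sketch of that estimate, assuming the 2022-era split still holds (a big assumption given how much inference has grown since):

    # Google's reported split of ML energy (Patterson et al., 2022):
    # ~3/5 inference, ~2/5 training.
    inference_share = 3 / 5
    training_share = 2 / 5

    # Cost of delivering one AI result, as a multiple of its inference energy,
    # if training is amortized in proportion to the overall split.
    exact_multiplier = 1 + training_share / inference_share  # = 1 + (2/5)/(3/5) ~= 1.67x
    rounded_multiplier = 2.0                                 # the "close enough to 50/50" rounding above

    print(f"exact: {exact_multiplier:.2f}x inference energy")
    print(f"50/50 approximation: {rounded_multiplier:.1f}x inference energy")

So the exact split gives about 1.67x, and rounding to 50/50 gives the 2x figure quoted above.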
My gosh you're right! The paper in question is https://arxiv.org/pdf/2204.05149, "The Carbon Footprint of Machine Learning Training Will Plateau, Then Shrink"