
> the takeaway is that ordinary Google searching is also quite energy-intense.

A related takeaway should be that machine-learning inference is pervasive and has been for years, and that defining "AI" to mean just chatbots ignores most of the iceberg.


I'd still love to see a report that accurately captures training cost. Today's report[1] notably excludes training cost.

Not just "one training run," but the cost of a thousand AI engineers starting failing runs to get to that one deployed model.

1: Google's tech report (https://services.google.com/fh/files/misc/measuring_the_envi...): "We leave the measurement of AI model training to future work."


> I'd still love to see a report that accurately captures training cost. Today's report[1] notably excludes training cost.

From 2022, so possibly out of date: "ML training and inference are only 10%–15% of Google’s total energy use for each of the last three years, each year split ⅗ for inference and ⅖ for training." That's close enough to a 50/50 split to say, roughly, that the full energy cost of delivering an AI result is about double the inference energy.

https://research.google/blog/good-news-about-the-carbon-foot...
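As a quick sanity check on that multiplier (my own back-of-envelope, assuming the 2022 figure of ⅗ inference / ⅖ training still roughly holds):

  # Back-of-envelope check, assuming the 2022 split (3/5 inference, 2/5 training)
  # still roughly holds; the shares are illustrative, not taken from today's report.
  inference_share = 3 / 5
  training_share = 2 / 5
  multiplier = (inference_share + training_share) / inference_share
  print(multiplier)  # ~1.67x inference energy, vs. 2x if the split were 50/50

So at the stated 3:2 split the exact multiplier is closer to 1.67x; rounding the split to 50/50 is what gives the 2x figure above.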


It still kills me, every time, that the title embedded in the metadata of that original PDF is "Revamped Happy CO2e Paper".

My gosh, you're right! The paper in question is https://arxiv.org/pdf/2204.05149, "The Carbon Footprint of Machine Learning Training Will Plateau, Then Shrink".


