AI power consumption is now similar to that of some small countries


New figures from French energy management company Schneider Electric claim artificial intelligence (AI) is now consuming an estimated 4.3GW of power globally, almost as much as some small countries.
As adoption of the technology increases, so will its power usage. By 2028, Schneider Electric reckons AI will account for between 13.5GW and 20GW, representing 26-36% compound annual growth.
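As a rough sanity check on those growth figures (assuming a 2023 baseline of 4.3GW and a five-year horizon to 2028, which is our reading rather than something spelled out in the report), the compound annual growth rate can be worked out as follows:

```python
# Rough sanity check of Schneider Electric's growth figures.
# The 2023 baseline and five-year horizon are assumptions for illustration.
baseline_gw = 4.3               # estimated AI power draw today
projections_gw = [13.5, 20.0]   # low/high 2028 projections
years = 5                       # assumed horizon

for target in projections_gw:
    cagr = (target / baseline_gw) ** (1 / years) - 1
    print(f"{baseline_gw}GW -> {target}GW over {years} years: {cagr:.0%} CAGR")
# Prints roughly 26% and 36%, matching the quoted 26-36% range.
```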
The study also reveals the power intensity of data centers more generally, presenting an eye-opening reality for which we should prepare in two key ways: upgrading infrastructure and improving efficiency.
AI adding to data center power usage
Currently, artificial intelligence accounts for around 8% of total data center power consumption, which Schneider Electric puts at 54GW. By 2028, the firm expects data center usage to climb to 90GW, with artificial intelligence making up around 15-20% of this.
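Those shares line up roughly with the headline projections. A quick check, assuming the percentages apply to the global totals quoted above:

```python
# Quick check of the data center share figures quoted above.
total_now_gw = 54
ai_share_now = 0.08
print(f"AI today: ~{total_now_gw * ai_share_now:.1f}GW")        # ~4.3GW

total_2028_gw = 90
for share in (0.15, 0.20):
    print(f"AI in 2028 at {share:.0%}: ~{total_2028_gw * share:.1f}GW")
# ~13.5GW and ~18GW, broadly in line with the 13.5-20GW projection.
```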
The company also noted that AI workloads are currently split roughly 20:80 between training and inference, a balance expected to tilt further toward inference in the coming years.
Also reported in the paper is the requirement for cooling: excess heat presents a safety hazard and can lead to premature component failure. Because air cooling alone is not sufficient for large clusters, which would otherwise become dangerously hot, cooling not only requires an additional surge of electricity but often correlates with high water usage too. Data centers have long been criticized for their use of natural resources, which in some cases sees them having to reroute or otherwise modify watercourses.
Looking ahead, Schneider says accurately predicting energy usage will become more challenging as energy-intensive training gives way to inference workloads, which have much more variable power requirements.
The company also offers advice to data center operators looking to capitalize on the latest AI hardware: transitioning away from a conventional 120/208V distribution to 240/415V should allow them to accommodate the high power densities of AI workloads.
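To illustrate why the higher distribution voltage matters, here is a simplified three-phase calculation; the 60A breaker rating, 80% continuous derating and unity power factor are illustrative assumptions, not figures from the report:

```python
import math

# Simplified three-phase power per circuit: P = sqrt(3) * V_line * I.
# Breaker rating, derating and power factor are assumed for illustration.
def circuit_power_kw(line_voltage, breaker_amps=60, derating=0.8):
    return math.sqrt(3) * line_voltage * breaker_amps * derating / 1000

print(f"208V circuit: ~{circuit_power_kw(208):.1f} kW")  # ~17.3 kW
print(f"415V circuit: ~{circuit_power_kw(415):.1f} kW")  # ~34.5 kW
# Roughly doubling the distribution voltage doubles the power a rack
# circuit can deliver without increasing conductor current.
```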
Clearly, the work of upgrading infrastructure must go hand in hand with efficiency improvements if the current trajectory of power usage is to be tamed and cloud computing and AI workloads made more efficient.