AI is resource intensive for any platform, including public clouds. Most AI technology requires numerous inference calculations that add up to higher processor, network, and storage requirements—and higher power bills, infrastructure costs, and carbon footprints.
The rise of generative AI systems, such as ChatGPT, has brought this issue to the forefront again. Given the popularity of this technology and the likely massive expansion of its use by companies, governments, and the public, we could see the power consumption growth curve take on a concerning arc.
AI has been viable since the 1970s but initially had little business impact, given the resources required to run a full-blown AI system. I remember designing AI-enabled systems in my 20s that would have required more than $40 million in hardware, software, and data center space just to get running. Spoiler alert: that project, like many other AI projects, never saw a release date. The business cases simply did not work.
InfoWorld Cloud Computing