When we enter a simple prompt, it looks like just a small task. But in reality, the work happens in huge data centers around the world. These centers house thousands, even millions, of powerful computers (especially GPU chips) that run AI models – and those chips consume a lot of electricity.
Demand for AI is growing so fast that there is a shortage of computing power, and companies are paying more to rent GPUs. For example, renting Nvidia’s Blackwell chip for an hour cost $2.75 two months ago; now it costs $4.08 – a nearly 48% increase. The rental price of the older H100 GPU has also risen by about 40%.
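The percentage jump can be verified with a quick back-of-the-envelope calculation using the hourly rental prices quoted above (a sketch, not official pricing data):

```python
# Percentage increase in hourly GPU rental price (figures quoted above)
def pct_increase(old_price, new_price):
    """Return the percentage increase from old_price to new_price."""
    return (new_price - old_price) / old_price * 100

# Blackwell hourly rental: $2.75 two months ago -> $4.08 now
jump = pct_increase(2.75, 4.08)
print(f"Blackwell rental increase: {jump:.1f}%")  # prints 48.4%
```

That 48.4% figure matches the "nearly 48%" jump mentioned in the article.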
Companies like OpenAI and Anthropic are struggling with this shortage. Anthropic’s Claude chatbot is experiencing frequent outages, and the company imposes limits on tokens (the units of text AI models process) on users during peak times.
Such problems have happened before in history
OpenAI shut down its popular Sora video-generation app to free up computing resources for new AI models and enterprise products. Token usage in OpenAI’s API jumped from 6 billion per minute in October to 15 billion per minute in March. Demand is so strong that infrastructure cannot be built fast enough to match it. This pattern is not new: similar situations arose during the railway, telecom, and internet booms – demand grows faster than resources. The same is now happening with AI.
Companies are competing to attract users, so raising prices is not easy. The outcome? Outages, usage limits, and some products being shut down.
Now the biggest question – how much electricity is all of this consuming?
A simple text prompt to something like ChatGPT uses about 0.3 to 0.34 watt-hours of power. That seems tiny – roughly the same as running an LED bulb for a few minutes. But with billions of prompts happening every day, the total becomes huge.
Data centers already use about 1.5% of the world’s electricity (about 415 TWh annually), and this demand is rising rapidly because of AI. Estimates suggest that by 2030, data-center power consumption may double or more.
Data centers in the US already use 4% of the country’s total electricity, and the share of servers running AI is only growing. A large AI data center can consume as much electricity as 100,000 (1 lakh) homes. Training (teaching an AI model) consumes a lot of electricity, but everyday use (inference) is also becoming a major share of total consumption. The more people use AI agents (which work on their own), the more electricity will be needed.
So does this mean we should abandon AI?
Not at all. But we must understand that this convenience is not free. It affects the environment – more electricity means more coal, gas, or renewable energy. Companies are improving efficiency, but demand is so strong that challenges remain.
Next time you enter a prompt, remember – there’s a whole power-hungry infrastructure behind it. AI is making us more productive, but it is also important to use it sustainably. In the future, better chips, green energy, and smarter usage can strike that balance.
