OpenAI CEO Sam Altman has revealed that customers speaking politely to ChatGPT, saying "please" and "thank you", is costing the company tens of millions of dollars. Still, he considers it money well spent.
Altman was responding to a post on X (formerly Twitter), which read, "I wonder how much money OpenAI has lost in electricity costs from people saying 'please' and 'thank you' to their models."
Altman replied, "Tens of millions of dollars well spent—you never know."
When a company trains an AI model to do things like recognize images or understand language, it uses massive datasets and powerful hardware like GPUs (graphics processing units), TPUs (tensor processing units), or other high-performance chips. This process consumes a large amount of electricity; training large models like GPT can require hundreds of megawatt-hours.
AI hardware also generates significant heat during training. Cooling systems, like air conditioning or liquid cooling, can consume nearly as much electricity as the computing itself.
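As a rough illustration of how those two factors combine, the sketch below estimates training energy from the number of accelerators, their power draw, and training time, then applies a cooling overhead. Every number in it is a hypothetical placeholder chosen for readability, not an actual OpenAI figure.

```python
# Back-of-envelope training-energy estimate.
# All numbers below are hypothetical placeholders, not OpenAI's figures.
NUM_GPUS = 1_000          # accelerators running in parallel
WATTS_PER_GPU = 400       # average draw per device, in watts
TRAINING_DAYS = 30        # wall-clock training time
COOLING_OVERHEAD = 1.8    # facility multiplier: ~1.0 means no overhead,
                          # ~2.0 means cooling uses as much power as compute

# watts * hours -> watt-hours; divide by 1e6 for megawatt-hours
compute_mwh = NUM_GPUS * WATTS_PER_GPU * TRAINING_DAYS * 24 / 1e6
total_mwh = compute_mwh * COOLING_OVERHEAD

print(f"Compute alone: {compute_mwh:.0f} MWh")  # 288 MWh
print(f"With cooling:  {total_mwh:.0f} MWh")    # 518 MWh
```

Even with these modest placeholder values, the total lands in the hundreds of megawatt-hours, which is why cooling overhead matters so much to a data center's bill.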
For context, training GPT-3 reportedly used about 1,287 megawatt-hours (MWh) of electricity, roughly enough to power 120 US homes for a year.
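The 120-homes comparison can be sanity-checked by dividing the reported training figure by average household consumption. The snippet below assumes the commonly cited US Energy Information Administration average of roughly 10,700 kWh per home per year.

```python
# Checking the "120 US homes" comparison.
GPT3_TRAINING_MWH = 1_287    # reported GPT-3 training energy
HOME_KWH_PER_YEAR = 10_700   # assumed average US household usage (EIA estimate)

# convert MWh to kWh, then divide by one home's annual usage
homes_for_a_year = GPT3_TRAINING_MWH * 1_000 / HOME_KWH_PER_YEAR
print(f"Equivalent households: {homes_for_a_year:.0f}")  # ~120
```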
Earlier this month, OpenAI released its latest reasoning models, o3 and o4-mini. These new "o-series" models are described as the company's most advanced yet. They can answer questions using all of ChatGPT's tools: web browsing, Python coding, and image analysis. With the addition of custom user tools, OpenAI is inching closer to its goal of enabling ChatGPT to complete tasks independently.