Energy Consumption of ChatGPT: A Closer Look
Recent findings suggest ChatGPT’s energy use is less than previously thought, varying with usage and AI models.

So, here’s the scoop on ChatGPT’s energy use—turns out, it’s not the power hog we all thought. OpenAI’s brainchild might actually be sipping electricity rather than guzzling it, especially with the latest GPT-4o model. Epoch AI, those nonprofit number crunchers, took a closer look and found that a single query might only use about 0.3 watt-hours. That’s a tenth of the old estimate of 3 watt-hours. Talk about an energy diet!
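To put the revised figure in context, here's a quick back-of-envelope sketch comparing the old and new per-query estimates at scale. The 0.3 Wh and 3 Wh numbers come from the reporting above; the daily query volume is a purely hypothetical assumption for illustration, not a figure from Epoch AI or OpenAI.

```python
# Back-of-envelope comparison of ChatGPT per-query energy estimates.
# 0.3 Wh and 3 Wh are the figures discussed in the article; the daily
# query volume is a HYPOTHETICAL assumption, used only for scale.

OLD_ESTIMATE_WH = 3.0            # earlier estimate, watt-hours per query
NEW_ESTIMATE_WH = 0.3            # Epoch AI's revised estimate, watt-hours per query
QUERIES_PER_DAY = 100_000_000    # hypothetical daily query volume (assumption)

def daily_energy_kwh(wh_per_query: float, queries: int) -> float:
    """Total daily energy in kilowatt-hours for a given per-query cost."""
    return wh_per_query * queries / 1000  # 1 kWh = 1000 Wh

old_kwh = daily_energy_kwh(OLD_ESTIMATE_WH, QUERIES_PER_DAY)
new_kwh = daily_energy_kwh(NEW_ESTIMATE_WH, QUERIES_PER_DAY)

print(f"Old estimate: {old_kwh:,.0f} kWh/day")   # 300,000 kWh/day
print(f"New estimate: {new_kwh:,.0f} kWh/day")   # 30,000 kWh/day
print(f"Ratio: {old_kwh / new_kwh:.0f}x")        # 10x
```

Whatever the real query volume turns out to be, the ratio is the point: a 10x drop in the per-query figure cuts any aggregate estimate by the same factor.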
Joshua You from Epoch AI put it into perspective: your toaster probably uses more juice than ChatGPT does answering your midnight existential questions. This comes at a time when everyone’s fretting over AI’s carbon footprint. Over a hundred groups are waving red flags, begging the industry and regulators to keep new data centers from turning into energy vampires.
Why the mix-up? Earlier studies were a bit behind the times, basing their numbers on older, clunkier chips. Epoch’s 0.3 watt-hour figure isn’t set in stone—it’s more of a ballpark—but it’s a wake-up call to get our facts straight. And hey, this doesn’t even cover the extra energy for fancy stuff like drawing pictures or digesting your novel-length prompts.
Looking forward, You is betting that as AI gets smarter (and, let's be honest, more high-maintenance), its energy needs will climb. Even with efficiency gains, the sheer scale of AI's growth could mean data centers drawing nearly as much power as all of California used in 2022 within a couple of years. Yikes.
OpenAI and friends are throwing billions at new data centers, aiming for models that can reason like a philosopher; the catch is that reasoning models demand a lot more computing muscle. You's advice? If you're sweating over your AI carbon footprint, maybe don't ask ChatGPT to write your memoirs: stick to the essentials and lean toward lighter models when you can.