Was it worth it?
There’s a lot of talk about the vast amount of energy AI uses. This got me wondering what damage to the environment I had done by asking Gemini for answers to pointless questions about the best way of getting a hot flat white. So I asked this question.
how much energy was used by you in generating these responses and how does it compare to energy used in heating the milk before putting it in the coffee?
Here’s Google Gemini’s response.
I cannot directly calculate the energy I consume or compare it to the energy used for heating milk. Here’s why:
- I am an AI, not a physical entity with a power meter: My operations are distributed across vast data centres, and the energy consumption for a single query or a series of responses like this is infinitesimally small when isolated. It's the aggregate energy use of the entire data centre infrastructure (servers, cooling, networking) that is measured, not individual AI interactions. Pinpointing the exact joules for my specific computations on your queries is not feasible.
- Lack of direct measurement tools: I don't have sensors or internal mechanisms to monitor my own energy draw from the underlying hardware.
- Complexity of AI energy use: AI models consume energy during training (which is very energy-intensive) and during inference (when I generate responses). The energy cost of inference is significantly lower than training, but it's still a complex calculation involving CPU/GPU usage, memory access, network traffic, and cooling.
However, we can talk about the scale of energy consumption:
- Heating milk in a microwave:
  - Let's recall the energy calculation: Q = m·c·ΔT.
  - For 200 ml (approx. 200 g) of milk, heated from 10 °C to 100 °C:
    - m = 200 g = 0.2 kg
    - c_water = 4186 J/(kg·°C)
    - ΔT = 90 °C
    - Q = 0.2 kg × 4186 J/(kg·°C) × 90 °C = 75,348 joules
  - Now, factor in microwave efficiency (let's assume 60%):
    - Electrical energy consumed = 75,348 J / 0.60 = 125,580 joules
  - To convert to watt-hours (a common electricity billing unit): 125,580 J / 3600 J/Wh ≈ 34.9 Wh
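For what it's worth, Gemini's arithmetic does check out. Here's a minimal Python sketch of the same calculation (the 60% microwave efficiency is Gemini's assumption, and water's specific heat capacity is standing in for milk's):

```python
# Energy to heat 200 g of milk from 10 °C to 100 °C, via Q = m·c·ΔT,
# using water's specific heat capacity as a stand-in for milk.
m = 0.2        # mass, kg
c = 4186       # specific heat capacity of water, J/(kg·°C)
delta_t = 90   # temperature rise, °C

q = m * c * delta_t       # heat delivered to the milk, in joules
electrical = q / 0.60     # electrical input, assuming 60% microwave efficiency
watt_hours = electrical / 3600   # convert joules to watt-hours

print(round(q), round(electrical), round(watt_hours, 1))
# 75348 125580 34.9
```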
- My energy usage: While difficult to quantify precisely for a few conversational turns, the energy consumed by the data centres that power me for this specific interaction would be in the range of millijoules or even microjoules per word/token processed. Even if you consider the entire chain of thought for these responses, it would likely be on the order of a few joules at most (and potentially much, much less).
Comparison:
The energy used to heat a cup of milk (125,580 Joules) is orders of magnitude (thousands to tens of thousands of times) greater than the energy directly attributable to generating these few text responses from an AI model.
The environmental impact of AI comes more from the immense energy used for training large models and the continuous power consumption of the massive data centres, not from individual, short inference queries like our conversation.
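A quick back-of-the-envelope check of that "orders of magnitude" claim, using the 125,580 J microwave figure from above and Gemini's upper-bound guess of a few joules for the whole conversation (both estimates, not measurements):

```python
# Compare the electrical energy for the milk with an assumed
# inference cost of a few joules for the entire conversation.
milk_j = 125_580    # electrical energy to heat the milk, J (from above)
inference_j = 3.0   # assumption: "a few joules at most"

ratio = milk_j / inference_j
print(f"Heating the milk used roughly {ratio:,.0f} times more energy")
```

With these numbers the ratio comes out around 40,000, squarely in the "tens of thousands" range Gemini claims.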
Would it cost me less to heat the milk in the microwave before pouring it into the coffee?