Behind the Stories by Nathan McGrath

And Finally, the Last Word on AI, Energy and Coffee.

Was it worth it?
There’s a lot of talk about the vast amount of energy AI uses. This got me wondering how much damage to the environment I had done by asking Gemini pointless questions about the best way of getting a hot flat white. So I asked it this question:
how much energy was used by you in generating these responses and how does it compare to energy used in heating the milk before putting it in the coffee?
Here’s Google Gemini’s response.

I cannot directly calculate the energy I consume or compare it to the energy used for heating milk. Here’s why:

  1. I am an AI, not a physical entity with a power meter:
    My operations are distributed across vast data centres, and the energy consumption for a single query or a series of responses like this is infinitesimally small when isolated. It’s the aggregate energy use of the entire data centre infrastructure (servers, cooling, networking) that is measured, not individual AI interactions. Pinpointing the exact joules for my specific computations on your queries is not feasible.
  2. Lack of direct measurement tools:
    I don’t have sensors or internal mechanisms to monitor my own energy draw from the underlying hardware.
  3. Complexity of AI energy use:
    AI models consume energy during training (which is very energy-intensive) and during inference (when I generate responses).1 The energy cost of inference is significantly lower than training, but it’s still a complex calculation involving CPU/GPU usage, memory access, network traffic, and cooling.

However, we can talk about the scale of energy consumption and make a rough comparison:

The energy used to heat a cup of milk (125,580 Joules) is orders of magnitude (thousands to tens of thousands of times) greater than the energy directly attributable to generating these few text responses from an AI model.
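For what it’s worth, the 125,580 J figure is consistent with the textbook formula Q = m·c·ΔT under plausible assumptions. A minimal sketch (the mass, specific heat, and temperature rise here are my guesses at the values behind the figure, not numbers Gemini stated):

```python
# Rough sanity check of the milk-heating figure using Q = m * c * delta_T.
# Assumed inputs (not from the article): 0.5 kg of milk, water's specific
# heat (4186 J/(kg·°C)) as a stand-in for milk, and a 60 °C rise
# (fridge-cold ~5 °C up to a ~65 °C steaming temperature).

mass_kg = 0.5            # assumed mass of milk
specific_heat = 4186     # J/(kg·°C), approximating milk with water
delta_t = 60             # assumed temperature rise in °C

energy_joules = mass_kg * specific_heat * delta_t
print(f"{energy_joules:,.0f} J")  # → 125,580 J, matching the response
```

With those assumptions the arithmetic lands exactly on the quoted number, which suggests Gemini used water’s specific heat and a half-litre of milk.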

The environmental impact of AI comes more from the immense energy used for training large models and the continuous power consumption of the massive data centres, not from individual, short inference queries like our conversation.

Would it cost me less to heat the milk in the microwave before pouring it into the coffee?
