Towards understanding the invisible impacts of AI
Have you ever wondered how much electricity and water your team consumes when using AI tools—particularly large language models (LLMs) like ChatGPT—on a daily basis?
We have. And we also wondered what the overall consumption might look like across the Belgian business landscape if everyone used these tools the way we do. Because even small changes in how we use AI—more queries, heavier files, more frequent usage—can quickly scale and generate real impacts, both positive and negative, on the environment at a national level and beyond.
Measuring internal usage: query volume and associated consumption
We measured the number of queries we sent to ChatGPT over the course of about one month. On average, a BetterBusiness employee makes 600 queries per month, or 30 queries per working day, for tasks such as email follow-ups, translations, regulatory summaries, and of course text generation. Per employee, this represents an electricity consumption of 0.18 kWh per month¹—enough energy to power a standard 9 W LED bulb for 20 hours per month, or about 1 hour per working day.
We were also curious about the direct water consumption associated with these queries, which amounts to approximately 1.34 L¹ per employee per month. Translated into beers, that's enough for a typical Belgian Friday evening.
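The per-employee figures above can be sketched as a short calculation. The per-query values below are back-derived from the article's monthly totals (0.18 kWh and 1.34 L over 600 queries); they are our working assumptions, not vendor-published numbers.

```python
# Per-employee monthly footprint of LLM usage at BetterBusiness.
QUERIES_PER_MONTH = 600                              # measured average per employee
ENERGY_PER_QUERY_WH = 0.18 * 1000 / QUERIES_PER_MONTH  # assumed: 0.3 Wh/query
WATER_PER_QUERY_ML = 1.34 * 1000 / QUERIES_PER_MONTH   # assumed: ~2.23 mL/query
LED_BULB_W = 9                                       # standard LED bulb

energy_kwh = QUERIES_PER_MONTH * ENERGY_PER_QUERY_WH / 1000
water_l = QUERIES_PER_MONTH * WATER_PER_QUERY_ML / 1000
led_hours = energy_kwh * 1000 / LED_BULB_W           # hours of LED lighting

print(f"{energy_kwh:.2f} kWh, {water_l:.2f} L, {led_hours:.0f} h of LED light")
```

Dividing the 20 hours of lighting by roughly 20 working days gives the "1 hour per working day" equivalence in the text.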
Scaling up: extrapolation to Belgian companies
These figures may seem low, but we are not the only company using LLMs. Data shows that 12.7%² of Belgian companies currently use LLMs in the workplace. We therefore wanted to estimate total electricity and water consumption if our usage of ChatGPT were replicated across these companies—and the numbers quickly become much more significant. In 2025, Belgium had 1.2 million companies³, with around 4.2 million employees⁴—an average of 3.5 employees per company. Based on this, we estimate that monthly electricity consumption related to LLMs in the business sector would reach 96.2 MWh, equivalent to powering 330 households per month (based on an average home outside Brussels)⁵, or more than 10 million hours of lighting with an LED bulb.
As for water consumption, the previous figures only account for water used directly to cool data centers. This represents approximately 718,000 liters, equivalent to the monthly consumption of 113 average Belgian households⁶⁷.
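The extrapolation above can be reproduced with the figures already cited. Note that the exact product comes out at roughly 96.0 MWh and 715,000 L; the article's 96.2 MWh and 718,000 L differ only by rounding in the intermediate steps.

```python
# Extrapolating BetterBusiness's per-employee usage to all Belgian
# companies that use LLMs. Inputs are the article's cited statistics
# plus the internally measured per-employee consumption.
COMPANIES = 1_200_000             # Belgian companies in 2025
LLM_ADOPTION = 0.127              # share of companies using LLMs
EMPLOYEES_PER_COMPANY = 3.5       # ~4.2M employees / 1.2M companies
ENERGY_PER_EMPLOYEE_KWH = 0.18    # monthly, measured internally
WATER_PER_EMPLOYEE_L = 1.34       # monthly direct cooling water

employees = COMPANIES * LLM_ADOPTION * EMPLOYEES_PER_COMPANY
energy_mwh = employees * ENERGY_PER_EMPLOYEE_KWH / 1000
water_l = employees * WATER_PER_EMPLOYEE_L

print(f"{employees:,.0f} employees using LLMs at work")
print(f"{energy_mwh:.1f} MWh of electricity per month")
print(f"{water_l:,.0f} L of cooling water per month")
```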
Comparative analysis: LLM queries vs traditional web searches
What if we used a traditional search engine instead? An LLM query consumes roughly 10 times more electricity than a standard web search⁸. Instead of powering 330 households per month, the same query volume would power only 33.
When it comes to water, things become more complex. There is no clear public data on direct water consumption per query for a standard web search. In other words, we have not found a precise figure indicating how much water is used to cool servers for a single Google search. To ensure a fair comparison, we therefore used the indirect water consumption linked to electricity production, referred to as “water in energy.” This makes it possible to compare both technologies on a common basis: the amount of water associated with the electricity required to process a query.
Some estimates still provide an order of magnitude. A United Nations study⁹ estimates that a Google search consumes around 0.5 mL of “water in energy,” while ChatGPT would consume around 500 mL for 5 to 50 queries. For our calculations, we use an assumption of 18 mL per LLM query, roughly 500 mL divided by the midpoint of that 5-to-50 range. By extrapolating BetterBusiness usage to all Belgian companies, indirect water consumption could reach 5.7 million liters for LLMs, compared to around 160,000 liters for the same number of queries via a search engine.
In other words, in this scenario, the indirect water footprint is approximately 36 times higher for LLM queries than for web searches. The reason is simple: an LLM query requires much more computing power. A search engine simply retrieves indexed information, while an LLM generates text through a complex neural network. More computation means more electricity—and therefore more water associated with its production.
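This comparison reduces to a short calculation over the same national query volume. The 18 mL per LLM query is the article's working assumption; the employee count is the 12.7% × 1.2M companies × 3.5 employees extrapolation.

```python
# Indirect "water in energy" footprint: LLM queries vs web searches
# for the same monthly query volume across Belgian companies.
EMPLOYEES = 533_400           # 12.7% of 1.2M companies x 3.5 employees
QUERIES_PER_EMPLOYEE = 600    # monthly, measured internally
WATER_LLM_ML = 18.0           # assumed mL per LLM query
WATER_SEARCH_ML = 0.5         # mL per web search (UN estimate)

queries = EMPLOYEES * QUERIES_PER_EMPLOYEE
llm_l = queries * WATER_LLM_ML / 1000
search_l = queries * WATER_SEARCH_ML / 1000

print(f"LLM: {llm_l:,.0f} L | search: {search_l:,.0f} L "
      f"| ratio: {llm_l / search_l:.0f}x")
```

Since both totals scale with the same query count, the 36× ratio is simply 18 mL divided by 0.5 mL.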
Recommendations: towards more intentional and efficient use of LLMs
One approach could be to wait for LLMs to become more efficient and less resource-intensive. Another is to deliberately choose lighter, less energy-intensive models when possible. Or, in some cases, to avoid using AI altogether if that remains realistic in a productivity-driven world.

At BetterBusiness, we chose not to wait. We implemented a simple internal usage framework. Employees are encouraged to use LLMs when there is a clear productivity gain or a real time constraint, not by default. We also created dedicated GPTs with structured system prompts to generate more relevant responses from the start. This reduces unnecessary back-and-forth and therefore the number of queries. Fewer iterations mean less computation.
We even worked on writing style. Shorter, clearer prompts generate shorter and more efficient responses. Our teams were sometimes a bit too polite and verbose with AI—which is charming, but leads to more text and therefore more processing. Being precise, direct, and structured is more efficient—for both humans and models.

We also introduced practical guidelines, particularly around “effective prompting,” prioritizing clarity over excessive experimentation. Finally, we monitor usage transparently to maintain awareness within our culture—not to slow down innovation, but to encourage conscious and responsible use.
None of these actions completely eliminate the impact. But they help shift from automatic usage to intentional usage—and at scale, that difference matters.
Conclusion: balancing usage and value creation
AI adoption will only continue to grow in the coming years. This tool—accessible, versatile, and time-saving—is simply too useful not to be used. This article does not even account for private, educational, or many other uses of AI already embedded in our daily lives—meaning these figures are still far from the full picture.
If consumption continues to grow at this pace, the efforts required to offset it will not be negligible. This will have implications for environmental and social balances and could exacerbate certain inequalities. The costs and “benefits” of AI will continue to rise, without everyone necessarily being compensated for this shift. Some may end up paying—directly or indirectly—for a technology that primarily benefits companies and organizations.
Of course, one could argue that AI enables better work, faster execution, and positive outcomes. But the more accessible a tool becomes, the more it is used—even when unnecessary. Overuse may ultimately cancel out the expected benefits, especially in a context where many organizations are already striving to reduce inequalities.
Yes, we are more productive than ever. But is it worth it?
The real question may not be “how much more can we do thanks to AI,” but rather “how much truly useful progress do we produce per unit of AI consumed?”
An equation worth exploring—and likely one that will become increasingly difficult to ignore.
References:
Jegham, N., Abdelatti, M., Koh, C. Y., Elmoubarki, L., & Hendawi, A. (2025). How hungry is AI? Benchmarking energy, water, and carbon footprint of LLM inference. arXiv preprint arXiv:2505.09598.
https://www.acerta.be/en/inspiration/using-artificial-intelligence-in-the-workplace-belgian-companies-in-top-3-in-europe
https://www.belganewsagency.eu/over-113000-new-companies-established-in-belgium-last-year
https://www.onss.be/actualites/2025/09/01/barometre-du-travail-salarie—juin-2025
https://www.energuide.be/en/questions-answers/what-is-the-average-electricity-and-gas-consumption-in-the-brussels-region/273
https://statbel.fgov.be/en/news/there-were-5163139-private-households-belgium-1-january-2024
De Vries, A. (2023). The growing energy footprint of artificial intelligence. Joule, 7(10), 2191-2194.
Gupta, J., Bosch, H., & van Vliet, L. (2024). AI’s excessive water consumption threatens to drown out its environmental contributions. The Conversation.

