
Can We Make AI Green? Big AI Sustainability Questions, Answered by Fastly’s Co-Founder

Alina Lehtinen-Vela

Content Marketing Manager

Can we make AI green, or is the AI race fundamentally at odds with sustainability? In this article, we speak with Simon Wistow, Fastly’s Co-founder and VP of Strategic Initiatives, to tackle this and other tough AI sustainability questions.

We also crunch the numbers from our latest global survey, The 2025 AI Energy Pulse Check, which gathered insights from 497 professionals managing AI infrastructure and sustainability across the US, Europe, and APAC.

The bad news? We’ve still got a long way to go when it comes to tracking AI energy use and optimizing workloads. Luckily, we’ve already got the tools to make AI a lot more sustainable; we just need to use them.

Question 1: Is Gen AI Really That Bad for the Environment?

There is a heated debate about AI sustainability online, with some tech influencers arguing that AI is not as bad for the environment as critics claim. Simon says the extremes of the debate aren’t helpful.

"Some people have got this idea that no matter what the cost, AI is the future... Others think it’s a disaster. The truth is somewhere in between."

To tackle the AI energy problem, we need to admit it does have a real environmental cost. Gen AI isn’t just some magical cloud outputting poetry and code. As Simon puts it, it’s millions of GPUs crunching vectors and consuming serious power, and most people don’t even realize how much.

He’s not exaggerating. Our AI Energy Pulse Check 2025 data shows companies are not tracking their AI energy use fully. In fact, fewer than 1 in 10 companies track more than 75% of their AI energy usage. The lack of tracking is a problem, especially as the scale of AI use continues to grow. Even small AI queries are incredibly energy-intensive. As Simon says, "One of the numbers that’s been thrown out there is that a query against some sort of large language model is about ten times more energy than querying Google. And Google does an enormous amount of work per query."

Luckily, there are tools that can make AI queries more sustainable. One of those tools is Fastly’s AI Accelerator, which enables AI query caching. 

"As we've gotten more sophisticated... more people are realizing that there are things that you can do. You can work smarter, not harder." Simon says. 

Question 2: How Can the AI Industry Balance Innovation With Sustainability?

Innovation and sustainability aren’t mutually exclusive, but they require a shift in mindset, Simon says. 

“I think the way that we can balance innovation versus sustainability is really thinking about how we do things, and being really thoughtful about stuff,” he says. “Models take a lot of energy to train if they're these mega models... but maybe not everything needs a mega model like the giant ones that are trained on the entire internet and take hundreds of millions of dollars to go and train.”

Instead of defaulting to the biggest, most expensive models, Simon advocates for smarter engineering and better reuse.

“We might be able to balance out the energy usage. We can reuse the models to train other models as well.”

There is huge potential in everyday infrastructure optimization, especially through AI query caching and shared effort, Simon says. Over half of our survey respondents estimate that 10–30% of their company’s AI queries are redundant. More than a quarter believe they could cut energy use by up to 50% through optimization alone.

“If you can use a semantic cache so that even if you phrase a question a slightly different way, you can still cache it, then you're not doing [unnecessary] work. And not doing the work is the best way to not burn energy.”

He adds, "Lots of people are implementing the same sort of caching strategy that we released last year. More people are realizing that there are things you can do [to make AI more sustainable]."

Though query caching and workload optimizations are gaining traction, they’re still not widespread. Currently, only 14.7% of European respondents, 27.3% in APAC, and 33.5% in the US say they use caching, edge, or workload optimization extensively. Across regions, respondents cited the complexity of managing edge AI as the biggest barrier to adoption.

Question 3: How Can We Make AI Greener?

Simon says efficiency can be engineered at many layers, from the model architecture to the infrastructure beneath it.

“We can work on research to make it so that large language models are more efficient to run. Maybe they're based on integers rather than floating point, and that will cut the energy usage down.”
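As a rough illustration of the integer idea, the toy Python sketch below quantizes float32 weights to 8-bit integers with a single scale factor. It is a generic symmetric-quantization example under our own assumptions, not a description of how any particular model is built or served.

```python
import numpy as np

def quantize_int8(weights: np.ndarray):
    """Map float32 weights onto int8 using one per-tensor scale factor."""
    scale = np.max(np.abs(weights)) / 127.0
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Approximate reconstruction used at inference time."""
    return q.astype(np.float32) * scale

# Each weight now takes 1 byte instead of 4, and integer math is typically
# cheaper per operation, which is the energy argument in a nutshell.
w = np.random.randn(4, 4).astype(np.float32)
q, scale = quantize_int8(w)
print("max reconstruction error:", np.max(np.abs(w - dequantize(q, scale))))
```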

He also floats the idea of shared or standardized models that companies or governments could chip in on, so we’re not all reinventing the wheel.

"Maybe we can reuse work between models... or have a standard common model funded by a consortium of companies or governments or science institutions."

Question 4: Where Do You See Sustainable AI Going in the Next 3–5 Years?

Simon offers a clear prediction:

"What's going to happen quite quickly is, as the cost of training models gets to the point where it's just ludicrous, it's necessarily going to make it so that people have to be a little bit smarter about the way that they train things."

In other words, economic forces will soon require smarter, leaner infrastructure, even without regulatory mandates.

Survey data backs this up. Nearly 45% of global respondents say they’d prioritize energy-efficient models if the cost of AI use were linked to energy consumption. Some APAC and US companies are also beginning to factor energy use into infrastructure decisions when choosing between edge and cloud deployments.

Rethinking AI’s Energy Future

The environmental costs of AI are real, but so are the solutions. As Simon puts it, we don’t need to pick between progress and sustainability; we just need to figure out how to make them work together. More transparency, shared infrastructure, and better engineering can cut the environmental cost of AI without slowing things down. 

Read the full 2025 AI Energy Pulse Check report to explore the complete data.

Disclaimer on the Survey Data

The insights shared from the 2025 AI Energy Pulse Check survey are intended to spark discussion, not serve as definitive industry benchmarks. While the survey was validated and covered a range of professionals across the US (315), Europe (116), and APAC (66), the regional sample sizes vary significantly. As with any self-reported data, responses may include some degree of bias or overstatement. Readers should consider these findings as directional rather than absolute.