AI’s Dirty Secret: Why Tech Giants Are Building Gas Plants, and Why Your ESG Report Should Care

The artificial intelligence industry has an energy problem it would rather not talk about. As AI models grow larger and inference workloads multiply, data centers are consuming electricity at rates that would make a steel plant blush. The solution some companies have landed on? Build their own natural gas power plants.

This is not a fringe development. Multiple AI infrastructure providers are now investing directly in gas-fired generation capacity, bypassing an electrical grid that simply cannot keep pace with demand. For CIOs and CTOs in India evaluating cloud and AI partnerships, this creates a sustainability puzzle that extends far beyond your own office walls.

The Scale of the Energy Problem

Training a large language model can consume as much electricity as a small town uses in a year. But training is largely a one-time cost per model. The real drain comes from inference — every time someone asks ChatGPT a question or runs an image through a recognition model, servers spin up and consume power.

Global data center electricity consumption is projected to double by 2030, with AI workloads driving much of that growth. In markets like the United States, where many hyperscale facilities are located, grid capacity additions have not kept pace. The result: AI companies are taking power generation into their own hands.

Natural gas offers a pragmatic, if problematic, solution. Plants can be built relatively quickly, provide reliable baseload power, and often carry lower upfront costs than equivalent renewable installations paired with battery storage. From a pure operational standpoint, the logic is sound.

Why This Matters for Your Carbon Footprint

Here is where it gets complicated for Indian enterprises. When your company uses a cloud AI service, the carbon emissions from that workload show up in your Scope 3 reporting — the category that covers indirect emissions from your supply chain and purchased services.

If your AI vendor is running inference on servers powered by natural gas, those emissions become part of your environmental footprint. This is true whether you are using the service from Mumbai or Minneapolis. The carbon does not care about geography.

Many technology vendors publish sustainability reports claiming carbon neutrality or renewable energy commitments. But these claims often rely on renewable energy certificates or carbon offsets rather than actual clean power at the point of consumption. A data center powered by gas but “offset” by wind credits in another state is still burning fossil fuels.

The Operational Risks Beyond Emissions

Carbon accounting is not the only concern. Companies that build their own power generation take on operational risks typically managed by utilities. Gas supply disruptions, maintenance failures, and regulatory changes all become direct business risks.

Fuel price volatility adds another variable. Natural gas prices can swing dramatically based on weather, geopolitics, and pipeline capacity. These costs eventually flow through to customers, making long-term AI infrastructure budgeting less predictable.

There is also regulatory risk to consider. Governments worldwide are tightening emissions standards and carbon pricing mechanisms. Infrastructure investments made today based on current regulations may face stranded asset problems if rules change. The European Union’s Carbon Border Adjustment Mechanism, for instance, could eventually affect the cost competitiveness of services powered by fossil fuels.

Questions to Ask Your Vendors Now

Indian enterprises cannot control how global AI providers power their facilities. But you can ask better questions and make more informed choices.

Start by requesting actual energy source breakdowns, not just renewable energy certificate percentages. Ask whether specific data center regions use on-site fossil fuel generation. Inquire about power purchase agreements and what happens when renewable supply falls short of demand.

Some vendors are more transparent than others. Google, for instance, publishes hourly carbon-free energy percentages for individual data center regions. This level of granularity lets you route workloads to cleaner facilities when flexibility exists.

Consider whether your AI workloads can tolerate latency. Running inference in a Nordic data center powered by hydroelectricity may add milliseconds but subtract tonnes of carbon. For batch processing jobs, the trade-off is often worth making.

What This Means for You

The AI industry’s dash toward gas-powered infrastructure exposes a tension that will only grow sharper. Enterprises face pressure to adopt AI for competitive reasons while simultaneously meeting sustainability targets that increasingly include Scope 3 emissions.

Do not assume your cloud vendor’s green branding tells the full story. Build energy transparency into your procurement criteria. Factor carbon costs into total cost of ownership calculations, especially as carbon pricing mechanisms mature.

The companies that navigate this well will treat AI infrastructure decisions as both technology and sustainability choices — because regulators, investors, and customers increasingly see them as the same thing.
