Yeah I guess I'm not the target audience for this because I assumed that "the power problem" was "massive increase in electricity costs for people despite virtually unchanged usage on their part", not "AI companies have to wait too long to be able to start using even more power than they already are":
> Nicole Pastore, who has lived in her large stone home near Baltimore’s Johns Hopkins University campus for 18 years, said her utility bills over the past year jumped by 50%. “You look at that and think, ‘Oh my god,’” she said. She has now become the kind of mom who walks around her home turning off lights and unplugging her daughter’s cellphone chargers.
> And because Pastore is a judge who rules on rental disputes in Baltimore City District Court, she regularly sees poor people struggling with their own power bills. “It’s utilities versus rent,” she said. “They want to stay in their home, but they also want to keep their lights on.”
https://www.bloomberg.com/graphics/2025-ai-data-centers-elec...
I understand the instinct but if people seriously think that they are solving any problem by unplugging cell phone chargers, they are simply bad at math. Human time is easily worth more than that, even when working at minimum wage.
That said, it obviously sucks that utility prices are rising for people who cannot effortlessly cover them (not to speak of the local pollution, if that's an issue). Maybe a special tax on hyperscalers to offset that cost to society would be a reasonable way to soften the blow, but I have not done the math.
And the air quality around these plants is poor, leading to health problems for the neighbors.
This short-term, destructive thinking should be criminalized.
I think it's time to discuss changing the incentives around AI deployment, specifically paying into a UBI fund whenever human jobs are replaced by AI. Musk himself raised the idea.
https://www.indexbox.io/blog/tech-leaders-push-for-universal...
In the case of Grok's turbines, no emissions controls means sick people. Plus all the CO2 pushing climate collapse faster which hurts every coming generation.
Coal plants are bad.
https://www.politico.com/news/2025/05/06/elon-musk-xai-memph...
Not so.
Gas plants are not bad… but imagine 400 MW of gas plants in a concentrated area. You’ll always have NOx and SOx by-products whenever you’re burning gas.
Gas is certainly less of a problem than coal, but gas plants still produce plenty of bad stuff: nitrogen oxides and VOCs like formaldehyde that are well documented to increase the risk of asthma and some types of cancer. I certainly wouldn’t want to live close to one.
The only way to solve problems like this IMO is to price in the externalities. Tax fossil fuels for the damage they do, in order to reveal their true cost. Then they will never look like the most affordable option, because they're not.
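To make that concrete, here is a back-of-the-envelope sketch of what a carbon price would add to gas-fired electricity. The ~0.49 kg CO2e/kWh intensity is roughly the figure another commenter cites further down the thread for gas generation, and the carbon prices are purely illustrative assumptions, not numbers from the article:

    # Rough sketch: what pricing the CO2e externality would add to gas power.
    # Assumed inputs (not from the article): ~0.49 kg CO2e per kWh for gas
    # generation, and a range of illustrative carbon prices in USD per tonne.
    intensity_kg_per_kwh = 0.49
    for price_per_tonne in (50, 100, 200):
        adder_usd_per_kwh = intensity_kg_per_kwh / 1000 * price_per_tonne
        print(f"${price_per_tonne}/tCO2e -> +${adder_usd_per_kwh:.3f}/kWh")

Even the low end is a meaningful bump relative to typical generation costs, which is the point: the "cheap" option stops looking cheap once the externality is on the bill.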
True. The same is true for nuclear energy. I have never heard of a nuclear power plant that did not receive substantial subsidies throughout its lifetime. Not to forget the nuclear fuel and the effort required to produce it and later to store the waste.
The natural gas turbines used are relatively efficient as far as engines go. Having them on-site makes transmission losses basically negligible.
Nothing short of full solar connected to batteries, produced without any difficult-to-mine elements, will make some people happy, but as far as pollution and fuel consumption go, data centers aren’t really a global concern at the same level as things like transportation.
I'm honestly curious whether you yourself are even aware of the disingenuousness of this argument. It's fairly impressive in its density!
1. Nobody complained about the efficiency of natural gas turbines. You can efficiently do a lot of useless stuff with deep negative externalities, and the fact it's efficient is not all that helpful.
2. Saying "the extreme far end would not be satisfied even by much better solutions" is not an excuse not to pursue better solutions!
3. There are many dimensions of this that people care about beyond the "global concern" level regarding "pollution and fuel consumption."
4. There are many problems that are significant and worth thinking about even if they are not the largest singular problems that could be singled out by some arbitrarily defined criteria.
> I'm honestly curious whether you yourself are even aware of the disingenuousness of this argument.
Unnecessarily condescending and smug, but I’ll try to respond.
That said, you’re putting forth your own disingenuous assumptions and misconceptions. The natural gas turbines are an intermediate solution to get up and running due to the extremely long and arduous process of getting connected to the grid.
Arguing pedantically about the word "efficiency" isn’t helpful either. The data centers are being built, sorry to anyone who gets triggered by that. The gas turbines are an efficient way to power them while waiting for grid interconnect and long-term renewables to come online.
Disingenuous is acting like this is a permanent solution to the exclusion of others. The whole point is that it gets them started now with portable generation that is efficient.
The gas turbines are hopefully an intermediate solution due to the long and not guaranteed process of grid connection and renewable buildout. History is of course full of such bets that did not work out the way their proponents hoped.
> The data centers are being built, sorry to anyone who gets triggered by that.
It's obvious that you're starting from your conclusion and working backwards, which is probably why your initial comment was full of so much motivated reasoning to begin with.
In your mind, is there any set of negative externalities that would justify not building the data centers, or at least not building them now, or at least not building them now in specific areas that require these types of interim solutions?
This is exactly right. These are glorified emergency generators, and grid power is ordinarily far cheaper, especially for interruptible loads like training new models (checkpointing work in progress and resuming it later is cheap and easy). The article mentions that quite clearly.
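For anyone wondering why training counts as an interruptible load: the standard practice is just periodic checkpointing, so pausing a run costs roughly one state save plus one reload. A minimal sketch (PyTorch-style; the names and path are hypothetical, not from the article):

    import os
    import torch

    CKPT_PATH = "run_checkpoint.pt"  # hypothetical checkpoint location

    def save_checkpoint(model, optimizer, step):
        # Persist everything needed to resume: weights, optimizer state, progress.
        torch.save({"model": model.state_dict(),
                    "optim": optimizer.state_dict(),
                    "step": step}, CKPT_PATH)

    def load_checkpoint(model, optimizer):
        # Resume from the last checkpoint if one exists, else start at step 0.
        if not os.path.exists(CKPT_PATH):
            return 0
        state = torch.load(CKPT_PATH)
        model.load_state_dict(state["model"])
        optimizer.load_state_dict(state["optim"])
        return state["step"]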
> as far as pollution and fuel consumption data centers aren’t really a global concern at the same level as things like transportation.
Not being at the same level doesn't remove the concern about this unnecessary pollution. Stop changing the subject from the environmental problems that AI usage can cause through its increased power consumption.
Natural gas engines are efficient!
Ok! But what about the pollution they produce to nearby neighborhoods? What about the health repercussions? Do human lives not matter?
https://www.politico.com/news/2025/05/06/elon-musk-xai-memph...
And imagine all this poorly located, overpriced, haphazardly thrown together and polluting infrastructure will basically get flushed down the toilet once either the AI bubble pops, or they figure out a new way of doing AI that doesn't require terawatts of power.
It didn't make long-term sense for our world before AI. It makes no more sense with AI.
> This is a really long way of saying "We need to burn fossil fuels to make more money."
Like every other industry in the world?
I’m kind of amazed that AI data centers have become the political talking point for topics like water usage and energy use when they’re just doing what every other energy-intensive industry does. The food arriving at your grocery store and the building materials that built your house also came from industries that consume a lot of fossil fuels to make more money.
I often like SemiAnalysis' work, but there are parts of this article that are shockingly under-researched and completely missing critical parts of the narrative.
> Eighteen months ago, Elon Musk shocked the datacenter industry by building a 100,000-GPU cluster in four months. Multiple innovations enabled this incredible achievement, but the energy strategy was the most impressive.
> Again, clever firms like xAI have found remedies. Elon's AI Lab even pioneered a new site selection process - building at the border of two states to maximize the odds of getting a permit early!
The energy strategy was to bypass permitting, almost certainly illegally, and to ignore the Clean Air Act, at a tangible cost to the surrounding community, measurably increasing respiratory irritants like NOx in the air around these communities. Characterizing this harm as "clever" is wildly irresponsible, and it's wild that the word "illegal" doesn't appear in the article once, even as it handwaves away the fact that permitting for local combustion-based generation is (for these reasons!) one of the main factors pushing out timelines and increasing cost.
[1] https://time.com/7308925/elon-musk-memphis-ai-data-center/
[2] https://www.selc.org/news/resistance-against-elon-musks-xai-...
[3] https://naacp.org/articles/elon-musks-xai-threatened-lawsuit...
*greed.
We are well past the point that any economic growth at all is anything but a distribution of income problem.
The problem is that most of the AI labs are popping up in TX, which has a uniquely isolated electrical grid. Recall how the Texas cold snap a few years ago took down the grid for days. Turns out if you make a grid based on short-term profit motives, it's not going to be flexible enough to take new demand.
It's not the grid's technological limitation. We could have lived in a world with a more connected grid, more nimble utility commissions, and a lot less methane/carbon emissions as a result.
Really cool in depth report, thanks for sharing. It's very interesting to see what these big datacenter deployments are actually doing. Go look at the oil price charts for the last 25 years and you'll see why it makes a ton of sense economically.
I also love how you can see the physical evidence of them pitting jurisdictions against each other from the satellite photos with the data center on one side of a state border and the power generation on the other.
What about renewables + battery storage? Does it take much longer to build? I can imagine getting a permit can take quite a long time, but what takes so long to set up solar panels and link them to batteries, without even having to connect them to the grid?
How many batteries is that? If we're talking solar and you have, say, a 300 MW datacenter that needs to operate for 12 hours without sun, you need at least two of the largest battery installation in the world [1] at 1,700 MWh. That doesn't factor in cloudy days.
[1] https://www.heise.de/en/news/850-MW-World-s-largest-battery-...
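The arithmetic, using the numbers from the comment above:

    # 300 MW datacenter riding through 12 sunless hours on batteries.
    load_mw = 300
    hours_dark = 12
    largest_install_mwh = 1700          # the record installation cited above

    energy_needed_mwh = load_mw * hours_dark              # 3,600 MWh
    installs_needed = energy_needed_mwh / largest_install_mwh
    print(energy_needed_mwh, round(installs_needed, 1))   # 3600, ~2.1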
Reciprocating natural gas engines can be moved from [concrete] pad to pad and be up and running in under 24 hours. The portable turbines take longer but they’re still fast.
Acquiring enough solar panels and battery storage still takes a very long time by comparison.
The power density of solar is also much lower - the coordination between different land parcels, routing power, and getting easements increases the time required vs. on-prem gas turbines.
Takes much longer to build, requires a much larger up-front investment, and requires a lot more land.
The footprint needed when trying to generate this much power from solar or wind necessitates large-scale land acquisition plus the transmission infrastructure to get all that power to the actual data center, since you won't usually have enough land directly adjacent to it. That plus all the battery infrastructure makes it a non-starter for projects where short timescales are key.
> Wärtsilä, historically a ship engine manufacturer, realized the same engines that power cruise ships can power large AI clusters. It has already signed 800MW of US datacenter contracts.
This seems like a big reach to me. Their largest engine (and it is absolutely massive) "only" produces 80MW of power. The Brayton cycle is unbeatable if you need to keep scaling power up to ridiculous levels.
Part of what bothers me about AI energy consumption isn't just how wasteful it might be from an ecological perspective; it's how brutally inefficient it is compared to the biological "state of the art": 2,000 kcal = 8,368 kJ, and 8,368 kJ / 86,400 s = 96.9 W.
So the benchmark is achieving human-like intelligence on a 100W budget. I'd be very curious to see what can be achieved by AI targeting that power budget.
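Spelling out that arithmetic, plus a rough comparison point (the ~700 W figure is my own assumption for a modern datacenter accelerator's board power, not something from the article):

    # Human "power budget" from a 2,000 kcal/day diet vs. an assumed ~700 W GPU.
    kcal_per_day = 2000
    human_watts = kcal_per_day * 4184 / 86400   # ~96.9 W
    assumed_gpu_watts = 700                     # assumption, for scale only
    print(round(human_watts, 1), round(assumed_gpu_watts / human_watts, 1))
    # ~96.9 W; the assumed accelerator draws roughly 7x a whole human's metabolic power.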
Is it though? When I ask an LLM research questions, it often answers in 20 seconds what it would take me an entire afternoon to figure out with traditional research.
Similarly, I've had times where it wrote me scientific simulation code that would take me 2 days, in around a minute.
Obviously I'm cherry-picking the best examples, but I would guess that overall, the energy usage my LLM queries have required is vastly less than my own biological energy usage if I did the equivalent work on my own. Plus it's not just the energy to run my body -- it's the energy to house me, heat my home, transport my groceries, and so forth. People have way more energy needs than just the kilocalories that fuel them.
If you're using AI productively, I assume it's already much more energy-efficient than the energy footprint of a human for the same amount of work.
For this kind of thinking to work in practice you would need to kill the people that AI makes redundant. This is apart from the fact that right now we are at a choke point where it's much more important to generate less CO2 than it is to write scientific simulation code a little quicker (and most people are using AI for much more unnecessary stuff, like marketing).
How so? A human needs the entire civilisation to be productive at that level. If you take just the entire US electricity consumption and divide it by the population, you'll get a result that's an order of magnitude higher. And that's just electricity. And that's just domestic consumption, even though US Americans consume tons of foreign-made goods.
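Rough numbers for that comparison; the totals are my own approximate assumptions, not figures from the thread:

    # Average per-capita electric power in the US vs. the ~100 W metabolic budget.
    assumed_us_twh_per_year = 4000   # assumption: rough US annual electricity use
    assumed_population = 335e6       # assumption: rough US population
    hours_per_year = 8766

    avg_watts = assumed_us_twh_per_year * 1e12 / hours_per_year / assumed_population
    print(round(avg_watts))          # ~1,360 W, an order of magnitude above ~100 W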
In that case I think it would be only fair to also count the energy required for training the LLM.
LLMs are far ahead of humans in terms of the sheer amount of knowledge they can remember, but nowhere close in terms of general intelligence.
A computer uses orders of magnitude less energy than a human.
It's all about the task, humans are specialized too.
EDIT: maybe add a logarithm or other non-linear functions to make the gap even bigger.
Beyond the wastefulness, the linked article can't even remotely be taken seriously.
> An AI cloud can generate revenue of $10-12 billion dollars per gigawatt, annually.
What? I let ChatGPT swag an answer on the revenue forecast and it cited $2-6B rev per GW year.
And then we get this gem...
> Wärtsilä, historically a ship engine manufacturer, realized the same engines that power cruise ships can power large AI clusters. It has already signed 800MW of US datacenter contracts.
So now we're going to be spewing ~486 g CO₂e per kWh using something that wasn't designed to run 24/7/365 to handle these workloads? Datacenters choosing to use these forms of power should have to secure a local vote, showcasing annual measurements of NOx, CO, VOCs and PM, and then be held to them.
This article just showcases all the horrible bandaids being applied to procure energy in any way possible with little regard to health or environmental impact.
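For scale, taking the ~486 g CO2e/kWh figure at face value and assuming (my assumption, as an upper bound) that the full 800 MW of contracted capacity ran around the clock:

    # Annual CO2e if 800 MW of gas engines ran continuously at ~486 g CO2e/kWh.
    capacity_mw = 800
    g_co2e_per_kwh = 486
    kwh_per_year = capacity_mw * 1000 * 8766        # ~7.0 billion kWh
    tonnes_co2e = kwh_per_year * g_co2e_per_kwh / 1e6
    print(round(tonnes_co2e / 1e6, 1), "Mt CO2e per year")   # ~3.4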
This article is coming from one of the premier groups doing financial and technical analysis on the semiconductor industry and AI companies.
I trust their numbers a hundred times more than a ChatGPT guess.
But either way, how many human lives are spent making that file?
I can generate images or get LLM answers in below 15 seconds on mundane hardware. The image generator draws many times faster than any normal person, and the LLM even on my consumer hardware still produces output faster than I can type (and I'm quite good at that), let alone think what to type.
Speed highly correlates with power efficiency. I believe my hardware maxes out somewhere around 150W. 15 seconds of that isn't much at all.
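That works out to very little energy per request, using the 150 W and 15 s figures above:

    # Energy for one ~15-second generation on a ~150 W consumer machine.
    watts = 150
    seconds = 15
    joules = watts * seconds        # 2,250 J
    watt_hours = joules / 3600      # 0.625 Wh, a small fraction of a phone charge
    print(joules, watt_hours)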
Also, why are people moving mountains to make huge, power obliterating datacenters if actually "its fine, its not that much"?
> Also, why are people moving mountains to make huge, power obliterating datacenters if actually "its fine, its not that much"?
I presume that's mostly training, not inference. But in general anything that serves millions of requests in a small footprint is going to look pretty big.
Great analogy.
https://qz.com/boom-supersonic-jet-startup-ai-data-center-po...
Boom’s pivot to trying to build turbines for data centers wasn’t surprising once data center deployments started using turbines. Either their CEO saw one of the headlines or their investors forwarded it over and it became their new talking point.
What is interesting is how many people saw the Boom announcement and came to believe that Boom was a pioneer of this idea. They’re actually a me-too that won’t have anything ready for a long time, if they can even pull it off at all.
Boom doesn’t actually have a turbine yet. Their design partner publicly pulled out of their contract with Boom a while ago.
Boom has been operating on vaporware for a while. It’s one of those companies I want to see succeed but whatever they’re doing in public is just PR right now. Until they actually produce something (other than a prototype that doesn’t resemble their production goals using other people’s parts) their PR releases don’t mean a whole lot.
> What is interesting is how many people saw the Boom announcement and came to believe that Boom was a pioneer of this idea. They’re actually a me-too that won’t have anything ready for a long time, if they can even pull it off at all.
My first thought when seeing that article is “I can buy one of these right now from Siemens or GE, and I could’ve ordered one at any time in the last 50 years.”
So they solved the power problem by consuming more fossil fuel. Got it.
https://techcrunch.com/2025/07/03/xai-gets-permits-for-15-na...
That said, it is all pretty impressive.
And all without the proper permits! Using 35 generators when they were only allowed 15! Yay! So glad we're allowing AI companies to break law after law after law, all so their models can fail to reason their way through a basic Towers of Hanoi.
If you do the math, that's $10-$12 per watt year. There's approx 24×365.25=8766 hours in a year, so assuming that the datacenters would be running 24×7, that boils down to $1.14 to $1.37 in revenue per kWh. That's not a bad deal if power really is a major part of the expense.
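Same math in code, plus a comparison against an assumed electricity price; the $0.05-$0.10/kWh range is my assumption, not a figure from the article:

    # Revenue per kWh implied by $10-12B per GW-year of AI cloud revenue.
    hours_per_year = 24 * 365.25                 # 8,766
    kwh_per_gw_year = 1e6 * hours_per_year       # 1 GW = 1e6 kW
    for revenue_billion in (10, 12):
        rev_per_kwh = revenue_billion * 1e9 / kwh_per_gw_year
        print(f"${revenue_billion}B/GW-yr -> ${rev_per_kwh:.2f}/kWh")
    # At an assumed $0.05-$0.10/kWh for power, electricity would be only a
    # single-digit percentage of that revenue, which is the comment's point.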
Citation needed.
This is coming from a group that does analysis on the semiconductor and cloud industries and provides very expensive access to their models and info. They are the citation.
So I guess it’s not a bubble then since these companies are raking in the big revenues? Or maybe they are counting all those circular investments as revenues somehow?
I think that's most people's assumption. It's not that AI is worthless, but that it's significantly less valuable than investors are betting on.
> However, AI infrastructure cannot wait for the grid’s multiyear transmission upgrades. An AI cloud can generate revenue of $10-12 billion dollars per gigawatt, annually. Getting a 400 MW datacenter online even six months earlier is worth billions. Economic need dwarfs problems like an overloaded electric grid. The industry is already searching for new solutions.
wow, that's some logic. Environmentally unsound means of extracting energy directly damage the ecosystem in which humans need to live. The need for a functioning ecosystem "dwarfs" "problems" like billionaires not making enough billions. Fixing a ruined ecosystem would cost many more billions than whatever economic revenue the AI generated while ruining it. So if you're not harnessing the sun or wind (forget about the latter in the US right now, btw), you're burning things, and you can get lost with that.
This kind of short-sighted thinking happens because, when folks like this talk about generating billions of dollars of worth, their cerebellums are firing up as they think of themselves personally as billionaires, corrupting their overall thought processes. We really need to tax billionaires out of existence.
> Eighteen months ago, Elon Musk shocked the datacenter industry by building a 100,000-GPU cluster in four months. Multiple innovations enabled this incredible achievement, but the energy strategy was the most impressive. xAI entirely bypassed the grid and generated power onsite, using truck-mounted gas turbines and engines.
Wow, "truck-mounted gas turbines"? Who else could have mastered such a futuristic tech in so short a time? Seriously, who wrote this? Grok? And let's ignore that this needless burning of fossil fuel is making life on Earth harder for everyone and everything else.
I'm no fan of Musk, but you've got to admit it was a clever way to achieve the goal. SemiAnalysis don't do fanboy articles - their research is pretty in-depth. So they are stating it as they see it.
The problem ordinary people all over the world have is that governments are allowing this to happen. Maybe stricter regulation would prevent players such as Musk from coming up with such "innovations".
"Getting a permit for 15 turbines after having illegally used 35 turbines that then poisoned the air for the residences around the turbines" is a clever way to achieve the goal? I wouldn't call doing a blatant illegal action "clever", but rather sociopathic.
The dialog around AI resource use is frustratingly inane, because the benefits are never discussed in the same context.
LLMs/diffusers are inefficient from a traditional computing perspective, but they are also the most efficient technology humanity has created:
> AI systems (ChatGPT, BLOOM, DALL-E2, Midjourney) and human individuals performing equivalent writing and illustrating tasks. Our findings reveal that AI systems emit between 130 and 1500 times less CO2e per page of text generated compared to human writers, while AI illustration systems emit between 310 and 2900 times less CO2e per image than their human counterparts.
Source: https://www.nature.com/articles/s41598-024-54271-x
Natural Gas supply problem: worsened
Carbon in the atmosphere problem: worsened