

Maybe one day fusion will finally deliver, and we might have cheap, clean energy with no environmental consequences beyond a handful of big reactors per country. But until that day arrives and we work it out, we have to transition, and wind, solar, and batteries are winning because they are cheaper than gas, coal, and nuclear.
Initially a lot of AI was trained on consumer-grade GPUs, before any of these specialized AI cards/blades existed. The problem is that the models are quite large and hence require a lot of VRAM to work on, or you split the work and pay enormous latency penalties going across the network. Putting it all into one giant package costs a lot more, but it also performs a lot better, because AI training is not an embarrassingly parallel problem that can be easily split across many GPUs without penalty. So the goal is often to reduce the number of GPUs you need to get a result quickly enough, and that brings its own set of problems around power density in server racks.
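To make the VRAM argument concrete, here is a rough back-of-envelope sketch. The model size, bytes-per-parameter, training overhead factor, and 80 GB per-GPU figure are all illustrative assumptions, not vendor specs:

```python
import math

def weights_gb(params_billion: float, bytes_per_param: int = 2) -> float:
    """VRAM for the weights alone (fp16 = 2 bytes per parameter)."""
    return params_billion * 1e9 * bytes_per_param / 1e9

def gpus_required(params_billion: float, gpu_vram_gb: float = 80.0,
                  training_overhead: float = 8.0) -> int:
    """Rough GPU count for training. The ~8x overhead factor is an
    assumed rule of thumb covering gradients, optimizer state, and
    activations on top of the weights themselves."""
    total_gb = weights_gb(params_billion) * training_overhead
    return max(1, math.ceil(total_gb / gpu_vram_gb))

# A hypothetical 70B-parameter model: the weights alone (140 GB at fp16)
# already overflow a single 80 GB card, so the work must be split,
# and every split adds cross-GPU communication latency.
print(weights_gb(70))     # 140.0 GB of weights
print(gpus_required(70))  # 14 GPUs at 80 GB each, under these assumptions
```

This is why fitting more memory and compute into one giant package wins: each GPU removed from the split is network latency you no longer pay.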