At least that’s according to Alistair Croll’s post on Earth2Tech.
If I understand correctly, inefficient code increases processing time, which in turn increases the amount of energy needed to run a particular application.
Well worth a read, and you can find it here.
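To make the idea concrete, here's a toy Python sketch (my own illustration, not from Croll's post): the same question answered two ways, one with a wasteful quadratic algorithm and one with a linear pass. The slower version keeps the CPU busy far longer for an identical result, and CPU time is a rough proxy for energy drawn.

```python
import time

def count_duplicates_naive(items):
    # O(n^2): compares each element against everything after it.
    count = 0
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                count += 1
                break
    return count

def count_duplicates_fast(items):
    # O(n): one pass, remembering values already seen in a set.
    seen, count = set(), 0
    for item in items:
        if item in seen:
            count += 1
        else:
            seen.add(item)
    return count

# 4,000 items, every value appearing exactly twice.
data = list(range(2000)) * 2

start = time.perf_counter()
naive = count_duplicates_naive(data)
naive_time = time.perf_counter() - start

start = time.perf_counter()
fast = count_duplicates_fast(data)
fast_time = time.perf_counter() - start

print(naive, fast)                 # identical answers
print(f"naive: {naive_time:.4f}s, fast: {fast_time:.4f}s")
```

Both functions return the same count, but the naive one does millions of comparisons where the fast one does thousands. Scale that up across a data center and the "green coding" argument starts to make sense.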