It's also trivial to come to the same conclusion at a smaller scale.
You can run an LLM at home and see how much GPU and power it takes to run the larger models. If you ran that full time, your household power bill alone would most likely triple.
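As a rough sanity check, here's a back-of-envelope estimate of what a home inference rig running 24/7 could add to a power bill. All the numbers (GPU draw, overhead, electricity price) are assumptions, not measurements; scale them up for multi-GPU setups.

```python
# Hypothetical numbers: estimate monthly electricity cost of running
# a local LLM rig around the clock.
GPU_WATTS = 350           # assumed draw of one high-end GPU under load
RIG_OVERHEAD_WATTS = 150  # assumed CPU, fans, and PSU losses
HOURS_PER_MONTH = 24 * 30
PRICE_PER_KWH = 0.15      # assumed electricity price in $/kWh

kwh_per_month = (GPU_WATTS + RIG_OVERHEAD_WATTS) * HOURS_PER_MONTH / 1000
monthly_cost = kwh_per_month * PRICE_PER_KWH
print(f"{kwh_per_month:.0f} kWh/month, about ${monthly_cost:.2f}")
```

A single-GPU rig at these assumed figures lands around 360 kWh/month; a rig with several larger GPUs multiplies that accordingly, which is where the "bill triples" intuition comes from.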
“I felt a great disturbance in the Force, as if millions of voices suddenly cried out in terror and were suddenly silenced.”