9 comments
  • OpenAI could just stop training worse models for a bit

    17
  • The costs are significant and growing, but we should put some things into perspective to tackle the problem efficiently. As an individual, heavy usage of these tools (something like 1,000 generated images) still produces roughly the same emissions as driving across town, and generating text is negligible in pretty much all scenarios.

    Where we really need to be concerned is video generation (which could easily blow current energy usage out of the water) and water usage in these massive data centers. However, most of the current research on the subject does a pretty poor job of separating "AI" water usage from general usage. This is why the next step is enforcing transparency, so we can get a picture of how things are shaping up as this technology develops.

    All that said, there is some pretty low-hanging fruit when it comes to improving efficiency. A lot of these models are essentially first passes on a project, and efficiency will improve simply as developers start to target edge and local models. Similarly, these water-cooling systems are predicated on some fairly wasteful assumptions, namely that cool fresh water is abundant and does not warrant preservation. Simply factoring in that this is clearly no longer the case would go a long way towards reducing that usage.
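A quick back-of-envelope sketch of the "1,000 images ≈ driving across town" comparison above. The per-image and per-km figures here are assumptions chosen for illustration (published estimates vary widely), not measurements:

```python
# Sanity check of "1,000 generated images ~ driving across town".
# Both figures below are rough assumptions, not measured values:
#   ~3 g CO2e per generated image (order of magnitude from public estimates)
#   ~200 g CO2e per km for an average petrol car
images = 1000
g_per_image = 3.0   # assumed
g_per_km = 200.0    # assumed

image_emissions_g = images * g_per_image
equivalent_km = image_emissions_g / g_per_km

print(f"{images} images ~ {image_emissions_g / 1000:.1f} kg CO2e "
      f"~ driving {equivalent_km:.0f} km")
```

Under these assumed numbers, 1,000 images comes out to about 3 kg CO2e, on the order of a 15 km drive, which is consistent with the "across town" framing; swapping in different per-image estimates shifts the distance but not the order of magnitude.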

    16
    • To address the article a little more directly: it's notable that the article begins with Sam Altman's take on the subject. His feelings are based on two fundamentally flawed premises:

      1. These models MUST get bigger for the improvements that their users DEMAND.
      2. The only solution to any environmental criticism is FUSION, a technology that Altman has personally invested in.

      2 is ridiculous just on the face of it, but I think folks may have a harder time understanding why 1 is problematic. It is true that OpenAI's business model essentializes the idea that these models can't ever be run locally, but the incentive to use their cloud services is quickly diminishing as smaller, local models catch up. This cycle will likely continue until local models are good enough to serve the needs of the vast majority of people, especially as specialized hardware makes its way into more and more consumer devices.

      11
  • One assessment suggests that ChatGPT, the chatbot created by OpenAI in San Francisco, California, is already consuming the energy of 33,000 homes. It’s estimated that a search driven by generative AI uses four to five times the energy of a conventional web search. Within years, large AI systems are likely to need as much energy as entire nations.

    And it’s not just energy. Generative AI systems need enormous amounts of fresh water to cool their processors and generate electricity. In West Des Moines, Iowa, a giant data-centre cluster serves OpenAI’s most advanced model, GPT-4. A lawsuit by local residents revealed that in July 2022, the month before OpenAI finished training the model, the cluster used about 6% of the district’s water. As Google and Microsoft prepared their Bard and Bing large language models, both had major spikes in water use — increases of 20% and 34%, respectively, in one year, according to the companies’ environmental reports. One preprint suggests that, globally, the demand for water for AI could be half that of the United Kingdom by 2027.
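To put "the energy of 33,000 homes" on a more familiar scale, here is a rough conversion. The per-home annual consumption figure is an assumption (a commonly cited US average), so the result is an order-of-magnitude sketch only:

```python
# Rough scale of "consuming the energy of 33,000 homes".
# Assumption: an average US household uses ~10,700 kWh per year.
homes = 33_000
kwh_per_home_year = 10_700  # assumed average annual household use

annual_gwh = homes * kwh_per_home_year / 1e6   # total, in GWh/year
avg_mw = annual_gwh * 1000 / 8760              # continuous average draw, MW

print(f"~{annual_gwh:.0f} GWh/year, ~{avg_mw:.0f} MW continuous")
```

Under that assumption, 33,000 homes works out to roughly 350 GWh per year, or about 40 MW of continuous draw, which is on the scale of a single mid-sized data centre rather than a nation, lending some context to both the "33,000 homes" figure and the "entire nations" projection.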

    6
  • Energy use, or at least the amount of usable energy we consume, is always going to increase.

    It's not about stopping that; it's about making sure renewables increase a lot faster than that.

    The same amount of energy as 33,000 homes for a tool like this? That's peanuts.

    The real problem no one wants to talk about is population growth. We need fewer people, because people are going to use more energy as standards of living improve. But the world's probably fucking doomed anyway.

    0