agnos.is Forums

Save The Planet

Microblog Memes · 305 posts · 145 posters
  • J [email protected]

    Are you interpreting my statement as being in favour of training AIs?

    P This user is from outside of this forum
    P This user is from outside of this forum
    [email protected]
    wrote on last edited by
    #249

    I'm interpreting your statement as "the damage is done so we might as well use it"
    And I'm saying that using it causes them to train more AIs, which causes more damage.

    J 1 Reply Last reply
    0
    • P [email protected]

      I'm interpreting your statement as "the damage is done so we might as well use it"
      And I'm saying that using it causes them to train more AIs, which causes more damage.

      J This user is from outside of this forum
      J This user is from outside of this forum
      [email protected]
      wrote on last edited by
      #250

      I agree with your second statement. You have misunderstood me. I am not saying the damage is done so we might as well use it. I am saying people don't understand that it is the training of AIs which is directly power-draining.

      I don't understand why you think that my observation people are ignorant about how AIs work is somehow an endorsement that we should use AIs.

      P 1 Reply Last reply
      0
[email protected] — #251, replying to [email protected]'s original post:

I wish cloud genAI services were pay-per-use, priced at no less than the electricity cost per run. It would turn the overblown hype over ChatGPT clones into a question of whether it's really worth the operating cost. Right now it's all VC-funded.
        • J [email protected]

          There's no functional difference aside from usage and scale, which is my point.

          I find it interesting that the only actual energy calculations I see from researchers is the training and the things going along with the training, rather then the usage per actual request after training.

          People then conflate training energy costs to normal usage cost without data to back it up. I don't have the data either but I do have what I can do/see on my side.

          P This user is from outside of this forum
          P This user is from outside of this forum
          [email protected]
          wrote on last edited by
          #252

          I'm not sure that's true, if you look up things like "tokens per kwh" or "tokens per second per watt" you'll get results of people measuring their power usage while running specific models in specific hardware. This is mainly for consumer hardware since it's people looking to run their own AI servers who are posting about it, but it sets an upper bound.

          The AI providers are right lipped about how much energy they use for inference and how many tokens they complete per hour.

          You can also infer a bit by doing things like looking up the power usage of a 4090, and then looking at the tokens per second perf someone is getting from a particular model on a 4090 (people love posting their token per second performance every time a new model comes out), and extrapolate that.
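The extrapolation described above can be sketched as a back-of-envelope calculation. The 400 W draw and 50 tokens/s below are hypothetical placeholders, not measurements; real values vary widely by model, quantization, and batch size:

```python
# Back-of-envelope inference energy per token, using HYPOTHETICAL
# numbers: a GPU drawing ~400 W while generating ~50 tokens/s.

def joules_per_token(gpu_watts: float, tokens_per_second: float) -> float:
    """Energy per generated token in joules (1 W = 1 J/s)."""
    return gpu_watts / tokens_per_second

def tokens_per_kwh(gpu_watts: float, tokens_per_second: float) -> float:
    """Tokens generated per kilowatt-hour (1 kWh = 3.6e6 J)."""
    return 3.6e6 / joules_per_token(gpu_watts, tokens_per_second)

print(joules_per_token(400, 50))  # 8.0 J per token
print(tokens_per_kwh(400, 50))    # 450000.0 tokens per kWh
```

Self-reported consumer-hardware figures plugged into this kind of formula are exactly the "upper bound" mentioned above, since datacenter inference is batched and generally more efficient per token.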

          • J [email protected]

            I agree with your second statement. You have misunderstood me. I am not saying the damage is done so we might as well use it. I am saying people don't understand that it is the training of AIs which is directly power-draining.

            I don't understand why you think that my observation people are ignorant about how AIs work is somehow an endorsement that we should use AIs.

            P This user is from outside of this forum
            P This user is from outside of this forum
            [email protected]
            wrote on last edited by
            #253

            I guess.

            It still smells like an apologist argument to be like "yeah but using it doesn't actually use a lot of power".

            I'm actually not really sure I believe that argument either, through. I'm pretty sure that inference is hella expensive. When people talk about training, they don't talk about the cost to train on a single input, they talk about the cost for the entire training. So why are we talking about the cost to infer on a single input?
            What's the cost of running training, per hour? What's the cost of inference, per hour, on a similarly sized inference farm, running at maximum capacity?

            J 1 Reply Last reply
            1
[email protected] — #254, replying to [email protected]:

> And when it did, it also altered the results, making them worse, because it was trying to satisfy "fuck" as part of your search.

Well fuck...
[email protected] — #255, replying to [email protected]:

> Worse is Google, which insists on shoving a terrible AI-based result in your face every time you do a search, with no way to turn it off.
>
> I'm not telling these systems to generate images of cow-like girls, but I'm getting AI shoved in my face all the time whether I want it or not. (I don't.)

Someone posted here a while ago that if you use the URL https://www.google.com/search?q=%25s&udm=14 it doesn't include the AI search. I've updated my Google search links to use that instead of the base Google URL.
[email protected] — #256, replying to [email protected]'s original post:

Meanwhile, I'm downtown in my city cleaning windows in office buildings that are 75% empty, but the heat or AC is blasting on completely empty floors and most of the lights are on.
                  • M [email protected]

                    It's actually because small trucks were regulated out of the US market. Smaller vehicles have more stringent mileage standards that trucks aren't able to meet. That forces companies to make all their trucks bigger, because bigger vehicles are held to a different standard.

                    So the people who want or need a truck are pushed to buy a larger one.

                    5 This user is from outside of this forum
                    5 This user is from outside of this forum
                    [email protected]
                    wrote on last edited by
                    #257

                    They can meet them. But the profit margin is slimmer than if they use the giant frame.

                    1 Reply Last reply
                    2
                    • P [email protected]

                      I guess.

                      It still smells like an apologist argument to be like "yeah but using it doesn't actually use a lot of power".

                      I'm actually not really sure I believe that argument either, through. I'm pretty sure that inference is hella expensive. When people talk about training, they don't talk about the cost to train on a single input, they talk about the cost for the entire training. So why are we talking about the cost to infer on a single input?
                      What's the cost of running training, per hour? What's the cost of inference, per hour, on a similarly sized inference farm, running at maximum capacity?

                      J This user is from outside of this forum
                      J This user is from outside of this forum
                      [email protected]
                      wrote on last edited by
                      #258

                      Maybe you should stop smelling text and try reading it instead. 😛

                      Running an LLM in deployment can be done locally on one's machine, on a single GPU, and in this case is like playing a video game for under a minute. OpenAI models are larger than by a factor of 10 or more, so it's maybe like playing a video game for 15 minutes (obviously varies based on the response to the query.)

                      It makes sense to measure deployment usage marginally based on its queries for the same reason it makes sense to measure the environmental impact of a car in terms of hours or miles driven. There's no natural way to do this for training though. You could divide training by the number of queries, to amortize it across its actual usage, which would make it seem significantly cheaper, but it comes with the unintuitive property that this amortization weight goes down as more queries are made, so it's unclear exactly how much of the cost of training should be assigned to a given query. It might make more sense to talk in terms of expected number of total queries during the lifetime deployment of a model.
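The amortization point can be made concrete with a toy calculation. Every number below is made up purely for illustration; the only claim is the shape of the curve:

```python
# Toy amortization: a fixed one-off training cost divided across a
# growing number of lifetime queries. All figures are MADE UP.

TRAINING_KWH = 1_000_000.0       # hypothetical total training energy
INFERENCE_KWH_PER_QUERY = 0.003  # hypothetical marginal energy per query

def kwh_per_query(total_queries: int) -> float:
    """Marginal inference energy plus this query's amortized share of training."""
    return INFERENCE_KWH_PER_QUERY + TRAINING_KWH / total_queries

# The training share shrinks toward zero as lifetime queries grow,
# which is why the per-query figure depends on a number nobody knows
# in advance: the model's total lifetime usage.
for n in (1_000_000, 100_000_000, 10_000_000_000):
    print(n, kwh_per_query(n))
```

With a million lifetime queries the (made-up) training cost dominates each query; with ten billion it all but vanishes, illustrating why the amortized cost is only well defined once the expected lifetime query count is fixed.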

                      • 5 [email protected]

                        Someone posted here a while ago - if you use the URL https://www.google.com/search?q=%25s&udm=14 it doesn't include the AI search. I've updated my Google search links to use that instead of the base Google URL.

                        eyedust@lemmy.dbzer0.comE This user is from outside of this forum
                        eyedust@lemmy.dbzer0.comE This user is from outside of this forum
                        [email protected]
                        wrote on last edited by
                        #259

                        You can also use alternatives like startpage and ecosia which use google results, I believe.

                        A 1 Reply Last reply
                        8
[email protected] — #260, replying to [email protected]'s original post:

Laughs in Total Recall
[email protected] — #261, replying to [email protected]:

> I'm really OOTL when it comes to AI GHG impact. How is it any worse than crypto farms, or streaming services?
>
> How do their outputs stack up to traditional emitters like Ag and industry? I need a measuring stick.

> How is it any worse than crypto farms, or streaming services?

These two things are very different.

Streaming services are extremely efficient; they tend to be encode-once, decode-on-the-user's-device. Video was long considered a tough thing to serve, so engineers put tons of effort into making it efficient.

Cryptocurrency is literally designed to be as wasteful as possible while still being feasible. "Proof of work" (how Bitcoin and many other currencies operate) means that mining must burn as much computation as it can get away with on pointless operations, just to prove it tried. It's an abomination.
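To make the "wasteful by design" point concrete, here is a toy proof-of-work loop (illustrative only, not Bitcoin's actual block format): every hash that misses the difficulty target is pure discarded computation, and the protocol tunes the target so that misses vastly outnumber hits.

```python
import hashlib

def mine(block_data: bytes, difficulty_bits: int) -> tuple[int, int]:
    """Try nonces until SHA-256(block_data + nonce) falls below a
    difficulty target; return (winning_nonce, total_hashes_tried)."""
    target = 2 ** (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(block_data + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce, nonce + 1
        nonce += 1

# With a 16-bit difficulty, roughly 2**16 ≈ 65,000 hashes are computed
# on average, and all but the last one are thrown away. Real networks
# raise the difficulty so quintillions of hashes are wasted per block.
nonce, attempts = mine(b"example block", difficulty_bits=16)
print(attempts)
```

The work is "pointless" in exactly the sense above: the discarded hashes prove effort was spent, but compute nothing anyone needs.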

                            • J [email protected]

                              Maybe you should stop smelling text and try reading it instead. 😛

                              Running an LLM in deployment can be done locally on one's machine, on a single GPU, and in this case is like playing a video game for under a minute. OpenAI models are larger than by a factor of 10 or more, so it's maybe like playing a video game for 15 minutes (obviously varies based on the response to the query.)

                              It makes sense to measure deployment usage marginally based on its queries for the same reason it makes sense to measure the environmental impact of a car in terms of hours or miles driven. There's no natural way to do this for training though. You could divide training by the number of queries, to amortize it across its actual usage, which would make it seem significantly cheaper, but it comes with the unintuitive property that this amortization weight goes down as more queries are made, so it's unclear exactly how much of the cost of training should be assigned to a given query. It might make more sense to talk in terms of expected number of total queries during the lifetime deployment of a model.

                              P This user is from outside of this forum
                              P This user is from outside of this forum
                              [email protected]
                              wrote on last edited by
                              #262

                              You're way overcomplicating how it could be done. The argument is that training takes more energy:

                              Typically if you have a single cost associated with a service, then you amortize that cost over the life of the service: so you take the total energy consumption of training and divide it by the total number of user-hours spent doing inference, and compare that to the cost of a single user running inference for an hour (which they can estimate by the number of user-hours in an hour divided by their global inference energy consumption for that hour).

                              If these are "apples to orange" comparisons, then why do people defending AI usage (and you) keep making the comparison?

                              But even if it was true that training is significantly more expensive that inference, or that they're inherently incomparable, that doesn't actually change the underlying observation that inference is still quite energy intensive, and the implicit value statement that the energy spent isn't worth the affect on society

                              J 1 Reply Last reply
                              0
                              • A [email protected]

                                I'm not really saying that the curve itself is changing (sorry, I was really not clear), only that those other variables reduce actual energy demand later in the day because of the efficiency gains and thermal banking that happens during the peak energy production. The overproduction during max solar hours is still a problem. Even if the utility doesn't have a way of banking the extra supply, individual customers can do it themselves at a smaller scale, even if just by over-cooling their homes to reduce their demand after sundown.

                                Overall, the problem of the duck curve isn't as much about maxing out the grid, it's about the utility not having instantaneous power availability when the sun suddenly goes down. For people like me who work from home and have the flexibility to keep my home cool enough to need less cooling in the evening, having solar power means I can take advantage of that free energy and bank it to reduce my demand in the evening.

                                I get what you were saying now, but having solar would absolutely reduce my demand during peak hours.

                                I This user is from outside of this forum
                                I This user is from outside of this forum
                                [email protected]
                                wrote on last edited by
                                #263

                                It's a neat idea to over-cool in order to reduce consumption later on!

                                1 Reply Last reply
                                0
[email protected] — #264, replying to [email protected]:

> I am trying to understand what Google's motivation for this even is. Surely it is not profitable to be replacing their existing, highly lucrative product with an inferior alternative that eats up way more power?

To make search more lucrative, they enshittified it and went too far; for a short time there were great quarterly results, but now they're slowly losing users. So they're trying AI to fix it up.

It's also a signal to shareholders that they're implementing the latest buzzword, plus they're all worried that AI will take off and they'll have missed that train.
[email protected] — #265, replying to [email protected]:

> And when it did, it also altered the results, making them worse, because it was trying to satisfy "fuck" as part of your search.

If you can't search for "fuck," you can't search for "fuck google."

With apologies to Lenny Bruce.
                                    • P [email protected]

                                      When I’m told there’s power issues and to conserve power I drop my AC to 60 and leave all my lights on. Only way for them to fix the grid is to break it.

                                      G This user is from outside of this forum
                                      G This user is from outside of this forum
                                      [email protected]
                                      wrote on last edited by
                                      #266

                                      Literally rolling coal to own the cons

                                      1 Reply Last reply
                                      14
                                      • S [email protected]

                                        I don’t disagree with you but most of the energy that people complain about AI using is used to train the models, not use them. Once they are trained it is fast to get what you need out of it, but making the next version takes a long time.

                                        K This user is from outside of this forum
                                        K This user is from outside of this forum
                                        [email protected]
                                        wrote on last edited by
                                        #267

                                        This is a specious argument.

                                        Once a model has been trained once they don't just stop training. They refine and/or start training new models. Showing demand for these models is what has encouraged construction on 100s of new datacenters.

                                        1 Reply Last reply
                                        0
                                        • I [email protected]

                                          Also they can build nuclear power generators for the data centers but never for the residential power grid.

                                          K This user is from outside of this forum
                                          K This user is from outside of this forum
                                          [email protected]
                                          wrote on last edited by
                                          #268

                                          There's no money in selling residential energy.

                                          1 Reply Last reply
                                          0