agnos.is Forums

Save The Planet

  • F [email protected]

    It's closer to running 8 high-end video games at once. Sure, from a scale perspective it's further removed from training, but it's still fairly expensive.

    [email protected]
    #138

    really depends. You can locally host an LLM on a typical gaming computer.
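
    For anyone who wants to try it, here is a minimal sketch of what local hosting can look like, assuming you already have an Ollama server running on localhost and have pulled a small model; the model name below is only an example.

    ```python
    # Query a locally hosted LLM through Ollama's REST API.
    # Assumes an Ollama server is running on localhost:11434 and that a small
    # model (here "llama3.2", as an example) has already been pulled.
    import json
    import urllib.request

    def ask_local_llm(prompt: str, model: str = "llama3.2") -> str:
        payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
        req = urllib.request.Request(
            "http://localhost:11434/api/generate",
            data=payload,
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            return json.loads(resp.read())["response"]

    if __name__ == "__main__":
        print(ask_local_llm("Summarize the duck curve in one sentence."))
    ```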

    • S [email protected]

      You underestimate the number of people you wouldn't class as intelligent. If no one wanted massive trucks, they would have disappeared off the market within a couple of years because they wouldn't sell. They're ridiculous, inefficient hulks that basically no one really needs but they sell, so they continue being made.

      [email protected]
      #139

      It's actually because small trucks were regulated out of the US market. Smaller vehicles have more stringent mileage standards that trucks aren't able to meet. That forces companies to make all their trucks bigger, because bigger vehicles are held to a different standard.

      So the people who want or need a truck are pushed to buy a larger one.

      • A [email protected]

        This curve has changed somewhat since this study in 2016. More efficient home insulation, remote working, and energy-efficient cooling systems have had a large impact on this pattern. But assuming you have a well-insulated home, setting your thermostat to maintain a consistent temperature throughout the day will shift this peak earlier and lower the peak load at sunset, when many people are returning home. More efficient heat pumps with variable pressure capabilities also help a lot.

        Given just how many variables are involved, it's better to assume peak cooling load to be mid-day and work toward equalizing that curve, rather than reacting to transient patterns that are subject to changes in customer behavior. Solar installations are just one aspect of this mitigation strategy, along with energy storage, energy-efficient cooling systems, and more efficient insulation and solar heat gain mitigation strategies.

        If we're discussing infrastructure improvements, we might as well discuss home efficiency improvements too.

        [email protected]
        #140

        Do you have a source for the duck curve effect cooling off?

        The following is a two-year-old article hinting at an increase in the effect: https://www.powermag.com/epri-head-duck-curve-now-looks-like-a-canyon/ Afaik it hasn't changed much, but I'm open to news.

        • C [email protected]

          79 is like my ideal temp. Cities must love me.

          [email protected]
          #141

          My parents would love you.

          I don't even want to go to their house in the summer. I can't even think at that temperature.

          • D [email protected]

            I'm not sure what the point here is. If we don't like LLMs and data centers using power, then we use existing strategies that work, like taxing their power use and subsidizing household power use, which btw we're already doing almost everywhere around the world in some form or another.

            The data centers are actually easier to negotiate and work with than something like factories or households, where energy margins are much more brittle. A datacenter employs like 5 people, and you can squeeze it with policy to match social expectations; you can't do that with factories or households. So the datacenter energy problem is not that difficult, relatively speaking.

            [email protected]
            #142

            I am agreeing with you that the solutions exist, but the will to implement them is going to be the hard part. A big dampener is simply going to be the profit motive. There is more money in siding with the data center than with the households. Are households okay with an increase in price? A data center is likely to manage that better, or even just pay a bribe to someone. I used food as another example of a problem that is solved. We can grow food without fail and build the rail to get it where it needs to go. We just don't, because the need does not match profit expectations. There are talks of building nuclear power for some data centers, but such talk would not happen for normal households.

            • J [email protected]

              I know she's exaggerating but this post yet again underscores how nobody understands that it is training AI which is computationally expensive. Deployment of an AI model is a comparable power draw to running a high-end videogame. How can people hope to fight back against things they don't understand?

              [email protected]
              #143

              She's not exaggerating, if anything she's undercounting the number of tits.

              • J [email protected]

                really depends. You can locally host an LLM on a typical gaming computer.

                [email protected]
                #144

                You can, but that's not the kind of LLM the meme is talking about. It's about the big LLMs hosted by large companies.

                • J [email protected]

                  really depends. You can locally host an LLM on a typical gaming computer.

                  [email protected]
                  #145

                  Well, that's sort of half right. Yes, you can run the smaller models locally, but usually it's the bigger models that we want to use, and those would be very slow on a typical gaming computer, or even a high-end one. The hardware used in datacenters is not only more optimised for the task, it's also simply faster: each unit is quicker, and far more units are used than you would normally find in a gaming PC.

                  Now, these things aren't magic; the basic technology is the same, so where does the speed come from? The answer is raw power. These machines run insane amounts of power through them, with specialised cooling systems to keep them cool, and that comes at the cost of efficiency.

                  So whilst running a model is much cheaper than training one, it is far from free. And whilst you can run a smaller model on your home PC, that isn't directly comparable to how models are run in the datacenter. The use of AI is still very power hungry, even when not counting the training.
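
                  To make the "raw power" point concrete, here is a toy back-of-the-envelope sketch. Every number in it is an assumed ballpark for illustration, not a measurement of any real deployment, so swap in whatever figures you trust.

                  ```python
                  # Back-of-the-envelope comparison of a local small model vs. a hosted large model.
                  # All wattages and counts below are assumed ballparks for illustration only.

                  GAMING_GPU_WATTS = 350        # assumed draw of one high-end consumer GPU under load
                  DATACENTER_GPU_WATTS = 700    # assumed draw of one datacenter accelerator
                  ACCELERATORS_PER_NODE = 8     # assumed accelerators serving one large model

                  def kwh(watts: float, hours: float) -> float:
                      """Energy in kilowatt-hours at a constant power draw."""
                      return watts * hours / 1000.0

                  hours = 1.0
                  local = kwh(GAMING_GPU_WATTS, hours)
                  hosted = kwh(DATACENTER_GPU_WATTS * ACCELERATORS_PER_NODE, hours)
                  print(f"Local small model for 1 h:  ~{local:.2f} kWh")
                  print(f"Hosted large model for 1 h: ~{hosted:.2f} kWh ({hosted / local:.0f}x)")
                  ```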

                  • J [email protected]

                    I know she's exaggerating but this post yet again underscores how nobody understands that it is training AI which is computationally expensive. Deployment of an AI model is a comparable power draw to running a high-end videogame. How can people hope to fight back against things they don't understand?

                    [email protected]
                    #146

                    How about, fuck AI, end story.

                    • J [email protected]

                      really depends. You can locally host an LLM on a typical gaming computer.

                      [email protected]
                      #147

                      Yeh but those local models are usually pretty underpowered compared to the ones that run via online services, and are still more demanding than any game.

                      • I [email protected]

                        Do you have any data to support that this is actually the case? I see this all the time, but absolutely zero evidence beyond a 2015 Axios survey with no methodology or dataset. Nearly every article cites this one industry group with 3 questions that clearly aren't mutually exclusive categories and could be picked apart by a high school student.

                        I ask this question nearly every time I see this comment, and in 5 years I have not found a single person who can actually cite where this came from, or completely explain even how they got to that conclusion.

                        The truck owners I know, myself included, use them all the time for towing and like the added utility of having the bed as a secondary feature.

                        [email protected]
                        #148

                        The truck owners I know, myself included, use them all the time for towing and like the added utility of having the bed as a secondary feature.

                        Then you put it beside a truck from 30 years ago that's a quarter the overall size but has the same bed capacity and towing power, along with much better visibility instead of not being able to see the child you're about to run over. And then you understand what people mean when they say massive trucks: giant, ridiculously unnecessary things that are all about being a status symbol and dodging regulations rather than practicality.

                        • J [email protected]

                          really depends. You can locally host an LLM on a typical gaming computer.

                          [email protected]
                          #149

                          True, and that's how everyone who is able should use AI, but OpenAI's models are in the trillion-parameter range. That's 2-3 orders of magnitude more than what you can reasonably run yourself.
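
                          To put "trillion-parameter" and "2-3 orders of magnitude" in concrete terms, here is a quick weights-only memory estimate. The parameter counts and precisions are illustrative assumptions, not published specs.

                          ```python
                          # Rough memory footprint of a model's weights: parameters x bytes per parameter.
                          # Illustrative only; real deployments shard weights across many accelerators
                          # and need extra memory for activations and the KV cache.

                          def weight_memory_gb(num_params: float, bytes_per_param: float) -> float:
                              return num_params * bytes_per_param / 1e9

                          # A 7-billion-parameter model quantized to ~4 bits (0.5 bytes per parameter):
                          print(f"7B @ 4-bit:  ~{weight_memory_gb(7e9, 0.5):.1f} GB (fits on one gaming GPU)")

                          # A hypothetical 1-trillion-parameter model at 16-bit precision:
                          print(f"1T @ 16-bit: ~{weight_memory_gb(1e12, 2.0):.0f} GB (needs racks of datacenter GPUs)")
                          ```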

                          • [email protected]
                            This post did not contain any content.
                            [email protected]
                            #150

                            Climate change is unstoppable. Humanity is mostly doomed very, very soon.

                            So, fuck y'all, my two window units are running 24/7 @ 69F for the foreseeable future.

                            • I [email protected]

                              Why do you want a subsidy for batteries?
                              Installing batteries at scale in homes is incredibly expensive compared to an off-site battery, especially with regard to the move towards hydrogen.

                              [email protected]
                              #151

                              For the same reason we want to subsidize solar production in residential construction even though it's more efficient and cost-effective to do it at scale. Having energy production and storage at the point of use reduces strain on power infrastructure and helps alleviate the kind of load surging ayyy is talking about.

                              It's not a replacement for modernizing our power grids, either; it simply helps make them more resilient.

                              • H [email protected]

                                And I guess they need it to be inefficient and expensive, so that it remains exclusive to them. That's why they were throwing a tantrum over DeepSeek: it proved it doesn't have to be.

                                [email protected]
                                #152

                                Bingo.

                                Altman et al. want to kill open-source AI for a monopoly.

                                This is what the entire AI research space already knew even before DeepSeek hit, and why they (largely) think so little of Sam Altman.

                                The real battle in the space is not AI vs. no AI, but exclusive use by AI Bros vs. open models that bankrupt them. That's what I keep trying to tell /c/fuck_ai, as the "no AI" stance plays right into the AI Bros' hands.

                                  • [email protected]

                                  How about, fuck AI, end story.

                                  [email protected]
                                  #153

                                  How about, fuck capitalism? Have you lost sight of the goal?

                                  • F [email protected]

                                    True, and that's how everyone who is able should use AI, but OpenAI's models are in the trillion-parameter range. That's 2-3 orders of magnitude more than what you can reasonably run yourself.

                                    [email protected]
                                    #154

                                    This is still orders of magnitude less than what it takes to run an EV, which is an eco-friendly form of carbrained transportation, especially if you live in an area where the power source is renewable. On that note, it looks to me like AI is finally going to be the impetus to get the U.S. to invest in and switch to nuclear power. Isn't that altogether a good thing for the environment?
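
                                    If you want to sanity-check that comparison yourself, here is a tiny sketch. Both energy figures are assumptions, and published per-query estimates vary widely, so plug in whatever numbers you find credible.

                                    ```python
                                    # Toy sanity check of the LLM-vs-EV comparison; both figures are assumptions.

                                    WH_PER_LLM_QUERY = 3.0   # assumed energy for one large-model chat query, in Wh
                                    WH_PER_EV_KM = 180.0     # assumed EV consumption (~18 kWh per 100 km), in Wh per km

                                    queries_per_km = WH_PER_EV_KM / WH_PER_LLM_QUERY
                                    print(f"Under these assumptions, ~{queries_per_km:.0f} queries use the energy of 1 km of EV driving")
                                    ```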

                                      • [email protected]

                                      I'm really OOTL when it comes to AI GHG impact. How is it any worse than crypto farms, or streaming services?

                                      How do their outputs stack up to traditional emitters like Ag and industry? I need a measuring stick

                                      [email protected]
                                      #155

                                      The UC paper above touches on that. I will link a better one if I find it.

                                      But specifically:

                                      streaming services

                                      Almost all the power for this goes to internet infrastructure and the end device. Encoding videos (for them to be played thousands or millions of times) is basically free since it's only done once, with the exception being YouTube (which is still very efficient). Storage servers can handle tons of clients (hence they're dirt cheap), and (last I heard) Netflix even uses local cache boxes to shorten the distance.

                                      TBH it must be less per capita than CRTs. Old TVs burned power like crazy.

                                        • [email protected]

                                        First off let me say, thanks for having this conversation, I'm enjoying it.

                                        Educational holidays are a concession and would have to be tested. So holiday goers would have to show they're attending lectures and visiting sites for the bulk of their visit. I honestly haven't fleshed out the idea as I just came up with it.

                                        But to talk about tourism, I think it was Prague that was able to showcase just how damaging tourism truly is. The city centre has minuscule local residency due to properties being bought up to lease as Airbnbs. With businesses attempting to target tourists, prices of food and travel increased, and you know what didn't go up: wages. So people were forced to move out of the city and commute in just to serve tourists things they can't afford. During tourist season it's vibrant and busy; off-season it's a ghost town. The citizens aren't benefiting, it's exactly the opposite. Tourism is just imperialism flexing its muscles.

                                        [email protected]
                                        #156

                                        Absolutely! Like I said, this is a topic I've always struggled with, and I've leaned both ways. I just so happen to be leaning on the side of recreational air travel this week lol.

                                        The example with Prague strikes me as rooted in capitalism, not so much tourism. Like, ideally governments (local or otherwise) in tourist-heavy areas would step in and implement things that address those capitalistic problems you describe: penalize rental property conglomerates, enforce a liveable minimum wage, build affordable permanent housing and mixed-use spaces, etc. I hear your comparison between tourism and imperialism, and I get that some tourist areas are pretty awful, with the local residents treated as subhuman, and that definitely sucks, but idk, it feels more like a capitalist/classist issue to me.

                                          • [email protected]

                                          I'm really OOTL when it comes to AI GHG impact. How is it any worse than crypto farms, or streaming services?

                                          How do their outputs stack up to traditional emitters like Ag and industry? I need a measuring stick

                                          [email protected]
                                          #157

                                          Also, one other thing is that Nvidia clocks their GPUs (aka the world's AI accelerators) very inefficiently, because they have a pseudo-monopoly and they can.

                                          It doesn't have to be this way, and likely won't be in the future.
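
                                          For what it's worth, power-limiting is something anyone with an Nvidia card can experiment with. A minimal sketch below, assuming nvidia-smi is installed; the 250 W cap is just an example value, not a recommendation for any particular card.

                                          ```python
                                          # Sketch: cap an Nvidia GPU's board power with nvidia-smi's power-limit option.
                                          # Needs the Nvidia driver tools and usually root/administrator rights.
                                          import subprocess

                                          def set_gpu_power_limit(watts: int, gpu_index: int = 0) -> None:
                                              subprocess.run(
                                                  ["nvidia-smi", "-i", str(gpu_index), "-pl", str(watts)],
                                                  check=True,
                                              )

                                          if __name__ == "__main__":
                                              set_gpu_power_limit(250)  # example: cap GPU 0 at 250 W
                                          ```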
