agnos.is Forums

Brian Eno: “The biggest problem about AI is not intrinsic to AI. It’s to do with the fact that it’s owned by the same few people”

Technology
157 Posts, 90 Posters
  • C [email protected]
    This post did not contain any content.
[email protected]
#8

    Idk if it’s the biggest problem, but it’s probably top three.

    Other problems could include:

    • Power usage
    • Adding noise to our communication channels
    • AGI fears if you buy that (I don’t personally)
    • T [email protected]

      No?

Anyone can run an AI, even on the weakest hardware; there are plenty of small open models for this.

Training an AI requires very strong hardware; however, this is not an impossible hurdle, as the models on Hugging Face show.

[email protected]
#9

      But the people with the money for the hardware are the ones training it to put more money in their pockets. That's mostly what it's being trained to do: make rich people richer.

        • A [email protected]

The biggest problem with AI is that companies are illegally harvesting everything they can possibly get their hands on to feed it, forcing it into places where people have explicitly said they don't want it, and sucking up massive amounts of energy and water to create it, undoing everyone else's progress in reducing energy use and raising prices for everyone else at the same time.

          Oh, and it also hallucinates.

[email protected]
#11

          Eh I’m fine with the illegal harvesting of data. It forces the courts to revisit the question of what copyright really is and hopefully erodes the stranglehold that copyright has on modern society.

          Let the companies fight each other over whether it’s okay to pirate every video on YouTube. I’m waiting.

          • T [email protected]

            No?

Anyone can run an AI, even on the weakest hardware; there are plenty of small open models for this.

Training an AI requires very strong hardware; however, this is not an impossible hurdle, as the models on Hugging Face show.

Guest
#12

Yah, I'm an AI researcher, and with the weights released for DeepSeek, anybody can run an enterprise-level AI assistant. Running the full model natively requires about $100k in GPUs, but with that hardware it could easily be fine-tuned with something like LoRA for almost any application. That model can then be distilled and quantized to run on gaming GPUs.

It's really not that big of a barrier. Yes, $100k in hardware is, but from a non-profit entity's perspective that is peanuts.

Also, adding a vision encoder for images to DeepSeek would not be theoretically that difficult, for the same reason. In fact, I'm working on research right now that finds GPT-4o and o1 have similar vision capabilities, implying it's the same first-layer vision encoder, with textual chain-of-thought tokens read by subsequent layers. (This is a very recent insight as of last week by my team, so if anyone can disprove it, I would be very interested to know!)
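The distill-and-quantize point is easy to sanity-check with back-of-envelope memory math. A hedged sketch: the parameter counts (671B for a full DeepSeek-R1-scale release, 7B for a distilled variant) and the precisions are illustrative assumptions, not figures from the post:

```python
# Rough weight-memory math for running a model at different precisions.
# Ignores activations and KV cache; sizes are illustrative assumptions.
def model_memory_gb(n_params: float, bits_per_param: int) -> float:
    """Approximate memory needed just to hold the weights, in GB."""
    return n_params * bits_per_param / 8 / 1e9

full = model_memory_gb(671e9, 16)  # a 671B-parameter model at fp16
dist = model_memory_gb(7e9, 4)     # a 7B distilled model quantized to 4-bit

print(f"671B @ fp16:  ~{full:.0f} GB of weights")  # multi-GPU server territory
print(f"7B   @ 4-bit: ~{dist:.1f} GB of weights")  # fits on a gaming GPU
```

The gap of nearly three orders of magnitude between the two footprints is exactly why distillation plus quantization brings these models down from $100k server racks to consumer hardware.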

            • N [email protected]

              But the people with the money for the hardware are the ones training it to put more money in their pockets. That's mostly what it's being trained to do: make rich people richer.

[email protected]
#13

              We shouldn't do anything ever because poors

              • R [email protected]

                I’d say the biggest problem with AI is that it’s being treated as a tool to displace workers, but there is no system in place to make sure that that “value” (I’m not convinced commercial AI has done anything valuable) created by AI is redistributed to the workers that it has displaced.

[email protected]
#14

                The system in place is “open weights” models. These AI companies don’t have a huge head start on the publicly available software, and if the value is there for a corporation, most any savvy solo engineer can slap together something similar.

                • C [email protected]
                  This post did not contain any content.
[email protected]
#15

Like Sam Altman, who invests in Prospera, a private "start-up city" in Honduras where the board of directors picks and chooses which laws apply to them!

                  The switch to Techno-Feudalism is progressing far too much for my liking.

                  • K [email protected]

                    Idk if it’s the biggest problem, but it’s probably top three.

                    Other problems could include:

                    • Power usage
                    • Adding noise to our communication channels
                    • AGI fears if you buy that (I don’t personally)
[email protected]
#16

                    Dead Internet theory has never been a bigger threat. I believe that’s the number one danger - endless quantities of advertising and spam shoved down our throats from every possible direction.

                    • C [email protected]
                      This post did not contain any content.
Guest
#17

                      And yet, he released his latest album exclusively on Apple Music.

                      • C [email protected]
                        This post did not contain any content.
[email protected]
#18

                        Technological development and the future of our civilization is in control of a handful of idiots.

                        • P [email protected]

                          Eh I’m fine with the illegal harvesting of data. It forces the courts to revisit the question of what copyright really is and hopefully erodes the stranglehold that copyright has on modern society.

                          Let the companies fight each other over whether it’s okay to pirate every video on YouTube. I’m waiting.

[email protected]
#19

I would agree with you if the same companies challenging copyright (which protects the intellectual and creative work of "normies") were not also aggressively wielding copyright against the very people they are stealing from.

With the amount of corporate power tightly integrated with governmental bodies in the US (and now with DOGE dismantling oversight), I fear that whatever comes out of this is that humans own nothing and corporations own everything. The death of free, independent thought and creativity.

Everything you do, say, and create becomes instantly marketable and sellable by the major corporations, and you get nothing in return.

The world needs something a lot more drastic than a copyright reform at this point.

• [email protected]
#20

                            I don't care much about them harvesting all that data, what I do care about is that despite essentially feeding all human knowledge into LLMs they are still basically useless.

                            • P [email protected]

                              Eh I’m fine with the illegal harvesting of data. It forces the courts to revisit the question of what copyright really is and hopefully erodes the stranglehold that copyright has on modern society.

                              Let the companies fight each other over whether it’s okay to pirate every video on YouTube. I’m waiting.

[email protected]
#21

AI scrapers illegally harvesting data are destroying smaller and open-source projects. Copyright law is not the only victim.

                              https://thelibre.news/foss-infrastructure-is-under-attack-by-ai-companies/

                              • P [email protected]

                                Eh I’m fine with the illegal harvesting of data. It forces the courts to revisit the question of what copyright really is and hopefully erodes the stranglehold that copyright has on modern society.

                                Let the companies fight each other over whether it’s okay to pirate every video on YouTube. I’m waiting.

[email protected]
#22

                                So far, the result seems to be "it's okay when they do it"

• [email protected]
#23

                                  Oh, and it also hallucinates.

                                  This is arguably a feature depending on how you use it. I'm absolutely not an AI acolyte. It's highly problematic in every step. Resource usage. Training using illegally obtained information. This wouldn't necessarily be an issue if people who aren't tech broligarchs weren't routinely getting their lives destroyed for this, and if the people creating the material being used for training also weren't being fucked....just capitalism things I guess. Attempts by capitalists to cut workers out of the cost/profit equation.

                                  If you're using AI to make music, images or video... you're depending on those hallucinations.
                                  I run a Stable Diffusion model on my laptop. It's kinda neat. I don't make things for a profit, and now that I've played with it a bit I'll likely delete it soon. I think there's room for people to locally host their own models, preferably trained with legally acquired data, to be used as a tool to assist with the creative process. The current monetisation model for AI is fuckin criminal....

                                  • sturgist@lemmy.caS [email protected]

                                    Oh, and it also hallucinates.

                                    This is arguably a feature depending on how you use it. I'm absolutely not an AI acolyte. It's highly problematic in every step. Resource usage. Training using illegally obtained information. This wouldn't necessarily be an issue if people who aren't tech broligarchs weren't routinely getting their lives destroyed for this, and if the people creating the material being used for training also weren't being fucked....just capitalism things I guess. Attempts by capitalists to cut workers out of the cost/profit equation.

                                    If you're using AI to make music, images or video... you're depending on those hallucinations.
                                    I run a Stable Diffusion model on my laptop. It's kinda neat. I don't make things for a profit, and now that I've played with it a bit I'll likely delete it soon. I think there's room for people to locally host their own models, preferably trained with legally acquired data, to be used as a tool to assist with the creative process. The current monetisation model for AI is fuckin criminal....

[email protected]
#24

                                    Tell that to the man who was accused by Gen AI of having murdered his children.

• [email protected]
#25

They're not illegally harvesting anything. Copyright law is all about distribution. As much as everyone loves to think that when you copy something without permission you're breaking the law, the truth is that you're not. It's only when you distribute said copy that you're breaking the law (aka violating copyright).

                                      All those old school notices (e.g. "FBI Warning") are 100% bullshit. Same for the warning the NFL spits out before games. You absolutely can record it! You just can't share it (or show it to more than a handful of people but that's a different set of laws regarding broadcasting).

                                      I download AI (image generation) models all the time. They range in size from 2GB to 12GB. You cannot fit the petabytes of data they used to train the model into that space. No compression algorithm is that good.

The same is true for LLMs, RVC (audio models), and similar models/checkpoints. I mean, think about it: if AI were illegally distributing millions of copyrighted works to end users, they'd have to be including it all in those files somehow.

Instead of thinking of an AI model as a collection of copyrighted works, think of it more like a rough sketch of a mashup of copyrighted works. Like if you asked a person to make a Godzilla-themed My Little Pony, and what you got was that person's interpretation of what Godzilla combined with MLP would look like. Every artist would draw it differently. Every author would describe it differently. Every voice actor would voice it differently.

Those differences are the equivalent of the random seed provided to AI models. If you throw something at a random number generator enough times you could--in theory--get the works of Shakespeare. Especially if you ask it to write something just like Shakespeare. However, that doesn't mean the AI model literally copied his works. It's just making its best guess (it's literally guessing! That's how it works!).
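The "no compression algorithm is that good" argument is easy to put numbers on. A minimal sketch, assuming a round 2 PB training corpus and a 10 GB model file (both illustrative round numbers, not measured figures):

```python
# How extreme the "compression" would have to be if a model literally
# stored its training data. Sizes are assumed round numbers.
def required_ratio(training_bytes: float, model_bytes: float) -> float:
    """Ratio a lossless compressor would need to hit for a verbatim copy."""
    return training_bytes / model_bytes

ratio = required_ratio(2e15, 10e9)  # 2 PB of training data vs a 10 GB model
print(f"Would need {ratio:,.0f}:1 lossless compression")
```

Mature lossless compressors manage roughly single-digit ratios on text, nowhere near the five-orders-of-magnitude ratio a verbatim copy of the corpus would require.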

                                      • C [email protected]
                                        This post did not contain any content.
[email protected]
#26

                                        COO > Return.

                                        • N [email protected]

                                          But the people with the money for the hardware are the ones training it to put more money in their pockets. That's mostly what it's being trained to do: make rich people richer.

[email protected]
#27

                                          This completely ignores all the endless (open) academic work going on in the AI space. Loads of universities have AI data centers now and are doing great research that is being published out in the open for anyone to use and duplicate.

I've downloaded several academic models, and commercial models and AI tools are built on all that public research.

                                          I run AI models locally on my PC and you can too.
