agnos.is Forums

Brian Eno: “The biggest problem about AI is not intrinsic to AI. It’s to do with the fact that it’s owned by the same few people”

Technology · 157 Posts · 90 Posters
  • C [email protected]
    This post did not contain any content.
[email protected] wrote (#44):

Why is this message not being drilled into everyone's heads? Sam Altman: go to prison or publish your stolen weights.

• [email protected]

      The problem with being like… super pedantic about definitions, is that you often miss the forest for the trees.

      Illegal or not, seems pretty obvious to me that people saying illegal in this thread and others probably mean “unethically”… which is pretty clearly true.

[email protected] wrote (#45):

      I wasn't being pedantic. It's a very fucking important distinction.

      If you want to say "unethical" you say that. Law is an orthogonal concept to ethics. As anyone who's studied the history of racism and sexism would understand.

      Furthermore, it's not clear that what Meta did actually was unethical. Ethics is all about how human behavior impacts other humans (or other animals). If a behavior has a direct negative impact that's considered unethical. If it has no impact or positive impact that's an ethical behavior.

      What impact did OpenAI, Meta, et al have when they downloaded these copyrighted works? They were not read by humans--they were read by machines.

      From an ethics standpoint that behavior is moot. It's the ethical equivalent of trying to measure the environmental impact of a bit traveling across a wire. You can go deep down the rabbit hole and calculate the damage caused by mining copper and laying cables but that's largely a waste of time because it completely loses the narrative that copying a billion books/images/whatever into a machine somehow negatively impacts humans.

      It is not the copying of this information that matters. It's the impact of the technologies they're creating with it!

      That's why I think it's very important to point out that copyright violation isn't the problem in these threads. It's a path that leads nowhere.

      • N [email protected]

        AI scrapers illegally harvesting data are destroying smaller and open source projects. Copyright law is not the only victim

        https://thelibre.news/foss-infrastructure-is-under-attack-by-ai-companies/

[email protected] wrote (#46):

In this case they just need to publish the code as a torrent. Nobody would set up a crawler if all the data were already available in a torrent swarm.

        • G [email protected]

The biggest problem with AI is that it's the brute-force solution to complex problems.

Instead of trying to figure out the most power-efficient algorithm for artificial analysis, they just threw more data and power at it.

Besides how often it's wrong, by definition it won't ever be as accurate or as efficient as actual thinking.

It's the solution you come up with the day before the project is due, because you know it will technically pass and you'll get a C.

[email protected] wrote (#47):

          It's moronic. Currently, decision makers don't really understand what to do with AI and how it will realistically evolve in the coming 10-20 years. So it's getting pushed even into environments with 0-error policies, leading to horrible results and any time savings are completely annihilated by the ensuing error corrections and general troubleshooting. But maybe the latter will just gradually be dropped and customers will be told to just "deal with it," in the true spirit of enshittification.

• [email protected]

            They're not illegally harvesting anything. Copyright law is all about distribution. As much as everyone loves to think that when you copy something without permission you're breaking the law the truth is that you're not. It's only when you distribute said copy that you're breaking the law (aka violating copyright).

            All those old school notices (e.g. "FBI Warning") are 100% bullshit. Same for the warning the NFL spits out before games. You absolutely can record it! You just can't share it (or show it to more than a handful of people but that's a different set of laws regarding broadcasting).

            I download AI (image generation) models all the time. They range in size from 2GB to 12GB. You cannot fit the petabytes of data they used to train the model into that space. No compression algorithm is that good.

            The same is true for LLM, RVC (audio models) and similar models/checkpoints. I mean, think about it: If AI is illegally distributing millions of copyrighted works to end users they'd have to be including it all in those files somehow.

Instead of thinking of an AI model as a collection of copyrighted works, think of it more like a rough sketch of a mashup of copyrighted works. Like if you asked a person to make a Godzilla-themed My Little Pony and what you got was that person's interpretation of what Godzilla combined with MLP would look like. Every artist would draw it differently. Every author would describe it differently. Every voice actor would voice it differently.

Those differences are the equivalent of the random seed provided to AI models. If you throw something at a random number generator enough times you could--in theory--get the works of Shakespeare. Especially if you ask it to write something just like Shakespeare. However, that doesn't mean the AI model literally copied his works. It's just making its best guess (it's literally guessing! That's how it works!).
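For scale, here is a rough sketch of the ratio being described. The ~1 PB training-corpus figure is purely an illustrative assumption (not a claim about any specific model); the 12 GB figure is the largest checkpoint size mentioned above.

    # Illustrative arithmetic only: compare an assumed ~1 PB training corpus
    # (hypothetical figure) with a ~12 GB model checkpoint.
    TRAINING_CORPUS_BYTES = 1e15   # ~1 petabyte, assumption chosen for scale
    CHECKPOINT_BYTES = 12e9        # ~12 GB

    ratio = TRAINING_CORPUS_BYTES / CHECKPOINT_BYTES
    print(f"~{ratio:,.0f} : 1")    # ~83,333 : 1 -- far beyond any lossless compression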

[email protected] wrote (#48):

            The issue I see is that they are using the copyrighted data, then making money off that data.

            • A [email protected]

Not everything has to correlate directly with what you say in order to be valid or add to the conversation. You have a habit of ignoring parts of the conversation going on around you in order to feel justified in whatever statements you make, regardless of whether they are based in fact or speak to the comment you're responding to, and you are also doing to me the exact same thing you're upset about (because why else would you go to a whole other post to "prove a point" about downvoting?). I'm not going to even try to justify to you what I said in this post or that one because I honestly don't think you care.

              It wasn't you (you claim), but it could have been and it still might be you on a separate account. I have no way of knowing.

              All in all, I said what I said. We will not get the benefits of Generative AI if we don't 1. deal with the problems that are coming from it, and 2. Stop trying to shoehorn it into everything. And that's the discussion that's happening here.

[email protected] wrote (#49):

              because why else would you go to a whole other post to "prove a point" about downvoting?
              It wasn't you (you claim)

              I do claim. I have an alt, didn't downvote you there either. Was just pointing out that you were also making assumptions. And it's all comments in the same thread, hardly me going to an entirely different post to prove a point.

              We will not get the benefits of Generative AI if we don't 1. deal with the problems that are coming from it, and 2. Stop trying to shoehorn it into everything. And that's the discussion that's happening here.

              I agree. And while I personally feel like there's already room for it in some people's workflow, it is very clearly problematic in many ways. As I had pointed out in my first comment.

              I'm not going to even try to justify to you what I said in this post or that one because I honestly don't think you care.

              I do actually! Might be hard to believe, but I reacted the way I did because I felt your first comment was reductive, and intentionally trying to invalidate and derail my comment without actually adding anything to the discussion. That made me angry because I want a discussion. Not because I want to be right, and fuck you for thinking differently.

              If you're willing to talk about your views and opinions, I'd be happy to continue talking. If you're just going to assume I don't care, and don't want to hear what other people think...then just block me and move on. 👍

              • C [email protected]
                This post did not contain any content.
[email protected] wrote (#50):

No, Brian Eno, there are many open LLMs already. The problem is people like you who have accumulated too much and now control all the markets/platforms/media.

• [email protected] wrote (#51):

We spend energy on the most useless shit; why are people suddenly using it as an argument against AI? Have you ever seen someone complaining about Pixar wasting energy rendering their movies? Or about 3D studios rendering TV ads?

• [email protected]

                    They're not illegally harvesting anything. Copyright law is all about distribution. As much as everyone loves to think that when you copy something without permission you're breaking the law the truth is that you're not. It's only when you distribute said copy that you're breaking the law (aka violating copyright).

                    All those old school notices (e.g. "FBI Warning") are 100% bullshit. Same for the warning the NFL spits out before games. You absolutely can record it! You just can't share it (or show it to more than a handful of people but that's a different set of laws regarding broadcasting).

                    I download AI (image generation) models all the time. They range in size from 2GB to 12GB. You cannot fit the petabytes of data they used to train the model into that space. No compression algorithm is that good.

                    The same is true for LLM, RVC (audio models) and similar models/checkpoints. I mean, think about it: If AI is illegally distributing millions of copyrighted works to end users they'd have to be including it all in those files somehow.

Instead of thinking of an AI model as a collection of copyrighted works, think of it more like a rough sketch of a mashup of copyrighted works. Like if you asked a person to make a Godzilla-themed My Little Pony and what you got was that person's interpretation of what Godzilla combined with MLP would look like. Every artist would draw it differently. Every author would describe it differently. Every voice actor would voice it differently.

Those differences are the equivalent of the random seed provided to AI models. If you throw something at a random number generator enough times you could--in theory--get the works of Shakespeare. Especially if you ask it to write something just like Shakespeare. However, that doesn't mean the AI model literally copied his works. It's just making its best guess (it's literally guessing! That's how it works!).

Guest wrote (#52):

This is an interesting argument that I've never heard before. Isn't the question more about whether AI-generated art counts as a "derivative work", though? I don't use AI at all, but from what I've read, these models can generate work that includes watermarks from the source data; wouldn't that strongly imply that these are derivative works?

                    • C [email protected]
                      This post did not contain any content.
[email protected] wrote (#53):

Reading the other comments, it seems there is more than one problem with AI. Probably even some perks as well.

Shucks, another one of these complex issues, huh. Weird how everything you learn something about turns out to have these nuances.

                      • K [email protected]

                        Idk if it’s the biggest problem, but it’s probably top three.

                        Other problems could include:

                        • Power usage
                        • Adding noise to our communication channels
                        • AGI fears if you buy that (I don’t personally)
[email protected] wrote (#54):

                        Power usage

I'm generally a huge eco guy, but on power usage in particular I view this largely as a government failure. We have had incredible energy resources available that the government has chosen not to implement or has effectively dismantled.

It reminds me a lot of how recycling has been pushed so hard onto the general public instead of government laws on plastic usage and waste disposal.

It's always easier to wave your hands and blame "society" than it is to hold the actual wealthy and powerful accountable.

• [email protected] wrote (#55):

                          I see the "AI is using up massive amounts of water" being proclaimed everywhere lately, however I do not understand it, do you have a source?

                          My understanding is this probably stems from people misunderstanding data center cooling systems. Most of these systems are closed loop so everything will be reused. It makes no sense to "burn off" water for cooling.

                          • C [email protected]
                            This post did not contain any content.
[email protected] wrote (#56):

                            wrong. it's that it's not intelligent. if it's not intelligent, nothing it says is of value. and it has no thoughts, feelings or intent. therefore it can't be artistic. nothing it "makes" is of value either.

                            • I [email protected]

No, Brian Eno, there are many open LLMs already. The problem is people like you who have accumulated too much and now control all the markets/platforms/media.

[email protected] wrote (#57):

Totally right that there are already super impressive open source AI projects.

                              But Eno doesn't control diddly, and it's odd that you think he does. And I assume he is decently well off, but I doubt he is super rich by most people's standards.

                              • C [email protected]
                                This post did not contain any content.
[email protected] wrote (#58):

Either the article editing was horrible, or Eno is wildly uninformed about the world. Creation of AIs is NOT the same as social media. You can't blame a hammer when some evil person uses it to hit someone in the head, and there is more to 'hammers' than just assaulting people.

                                • S [email protected]

                                  I see the "AI is using up massive amounts of water" being proclaimed everywhere lately, however I do not understand it, do you have a source?

                                  My understanding is this probably stems from people misunderstanding data center cooling systems. Most of these systems are closed loop so everything will be reused. It makes no sense to "burn off" water for cooling.

[email protected] wrote (#59):

                                  data centers are mainly air-cooled, and two innovations contribute to the water waste.

                                  the first one was "free cooling", where instead of using a heat exchanger loop you just blow (filtered) outside air directly over the servers and out again, meaning you don't have to "get rid" of waste heat, you just blow it right out.

the second one was increasing the moisture content of the air on the way in, with what are basically giant carburettors in the air stream. the wetter the air, the more heat it can take from the servers.

                                  so basically we now have data centers designed like cloud machines.
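to put rough numbers on the evaporative part, here's a back-of-envelope sketch in python. it assumes every joule of waste heat is rejected by evaporating water (the worst case; real facilities mix in dry cooling), and it uses the standard latent heat of vaporization of water rather than figures from any particular data center.

    # back-of-envelope: litres of water evaporated to reject a given amount of
    # waste heat, assuming 100% evaporative cooling (worst case).
    LATENT_HEAT_MJ_PER_KG = 2.45   # latent heat of vaporization of water near room temperature
    MJ_PER_KWH = 3.6               # energy conversion: 1 kWh = 3.6 MJ

    def water_evaporated_litres(heat_kwh: float) -> float:
        """Approximate litres of water evaporated to carry away heat_kwh of heat."""
        kilograms = heat_kwh * MJ_PER_KWH / LATENT_HEAT_MJ_PER_KG
        return kilograms  # 1 kg of water is roughly 1 litre

    print(water_evaporated_litres(1))          # ~1.5 L per kWh of IT load
    print(water_evaporated_litres(1_000_000))  # ~1.5 million L per GWh

even well below that worst-case ceiling, the totals get large fast once a campus is drawing hundreds of megawatts.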

                                  • C [email protected]
                                    This post did not contain any content.
[email protected] wrote (#60):

AI will become one of the most important technologies humankind has ever invented. Apply it to healthcare, science, and finance, and the world will become a better place, especially in healthcare.
Hey artists and writers: you cannot stop intellectual evolution. AI is here to stay. All we need is a proven way to differentiate real art from AI art. An invisible watermark that can be scanned to reveal its true "raison d'être".
Sorry for going off topic, but I agree that AI should be more open to verification for using copyrighted material. Don't expect compensation, though.

                                    • C [email protected]

                                      So far, the result seems to be "it's okay when they do it"

[email protected] wrote (#61):

                                      Yeah... Nothing to see here, people, go home, work harder, exercise, and don't forget to eat your vegetables. Of course, family first and god bless you.

• [email protected]

                                        I wasn't being pedantic. It's a very fucking important distinction.

                                        If you want to say "unethical" you say that. Law is an orthogonal concept to ethics. As anyone who's studied the history of racism and sexism would understand.

                                        Furthermore, it's not clear that what Meta did actually was unethical. Ethics is all about how human behavior impacts other humans (or other animals). If a behavior has a direct negative impact that's considered unethical. If it has no impact or positive impact that's an ethical behavior.

                                        What impact did OpenAI, Meta, et al have when they downloaded these copyrighted works? They were not read by humans--they were read by machines.

                                        From an ethics standpoint that behavior is moot. It's the ethical equivalent of trying to measure the environmental impact of a bit traveling across a wire. You can go deep down the rabbit hole and calculate the damage caused by mining copper and laying cables but that's largely a waste of time because it completely loses the narrative that copying a billion books/images/whatever into a machine somehow negatively impacts humans.

                                        It is not the copying of this information that matters. It's the impact of the technologies they're creating with it!

                                        That's why I think it's very important to point out that copyright violation isn't the problem in these threads. It's a path that leads nowhere.

[email protected] wrote (#62):

                                        Just so you know, still pedantic.

                                        • K [email protected]

                                          Idk if it’s the biggest problem, but it’s probably top three.

                                          Other problems could include:

                                          • Power usage
                                          • Adding noise to our communication channels
                                          • AGI fears if you buy that (I don’t personally)
[email protected] wrote (#63):

                                          Could also put up:

                                          • Massive collections of people are exploited in order to train various AI systems.
                                          • Machine learning apps that create text or images from prompts are supposed to be supplementary but businesses are actively trying to replace their workers with this software.
• Machine learning image generators currently show diminishing returns in training as exponentially more content is pumped into them.
• Machine-learning-generated text and images poison their own generators' sample pools, greatly diminishing these systems' ability to learn from real-world content.

There's actually a much longer list if we expand to other AI systems, like the robot systems we're currently training for automated warfare. There's also the angle of these image and text generation systems being used for political manipulation and scams. There are a lot of terrible problems created by this tech.
