agnos.is Forums

OpenAI declares AI race “over” if training on copyrighted works isn’t fair use

technology
474 Posts 274 Posters 8 Views
  • [email protected]
    This post did not contain any content.
    [email protected]
    #425

    I have conflicting feelings about this whole thing. If you are selling the result of training like OpenAI does (and every other company), then I feel like it’s absolutely and clearly not fair use. It’s just theft with extra steps.

    On the other hand, what about open source projects and individuals who aren’t selling or competing with the owners of the training material? I feel like that would be fair use.

    What keeps me up at night is if training is never fair use, then the natural result is that AI becomes monopolized by big companies with deep pockets who can pay for an infinite amount of random content licensing, and then we are all forever at their mercy for this entire branch of technology.

    The practical, socioeconomic, and ethical considerations are really complex, but all I ever see discussed are these hard-line binary stances that would only have awful corporate-empowering consequences, either because they can steal content freely or because they are the only ones that will have the resources to control the technology.

  • [email protected]

      I think the da Vinci stuff is a different discussion entirely, as it has to do with commentary about art and not someone publishing someone else's work for profit without consent while doing whatever they see fit to it. And generally that bullet seems slightly different from what I typed, as my topic was theft of an artwork, not variation in viewers' interpretations.

      I like the 50 Shades of Grey example and approve of her changing it to be its own thing rather than either losing the effort put into the fanfic or trying to pass it off as Twilight canon without consent. Everything stated in that example feels good to me without triggering my immorality sensors.

      Sale of rights is nothing I have comments on at this current time.

      The Babel program is an exotic case of 'independently coming to something'.

      I personally don't write fan fiction at all, and it is easy to distinguish my written fiction from things AIs generate (at least with what AI is at this current time).

      I believe the key topic you hit is 'independently coming to things', and that that should be encouraged and is moral, while using expired copyright law to take someone else's work without their consent is immoral. I do not profess to yet have an ideal system for this in mind; I would focus here though, as it has the potential to replace the immoral parts of the system with moral parts. So yes, independently coming to something actually should receive positive feedback in comparison to purposely copying something the creator does not want copied.

      [email protected]
      #426

      I don’t think you can separate art and interpretation and critique, but they are often done by different parties. You don’t have to have an opinion on everything. Fair enough. I thought your opinion was that you opposed the misrepresentation of what a piece of art was about, e.g. My Little Pony is about x not y. I merely wanted to know the nature and extent of that opinion.

      I agree on the 50 Shades front but am surprised—she took existing characters and wrote a new story around them, which both precludes the original author from ever writing anything in that vein and changes how those characters are seen. The facade of a name change is just that in my opinion.

      I’ll admit that I’m confused as to the scenario where you were using MLP AI but it’s not my business! If it was not in a fan fic vein though, I’ll point out that while you take issue with the AI including non-canon material in its MLP training data and thus being non-representative, the owners of the MLP intellectual property would take issue with the use of their material and being too representative. Copyright is not used to preserve sanctity, it is used to monopolize profit opportunity.

      The Babel program is merely representative of the actual Library of Babel. Read the story. It's short and it's thoughtful.

      Consent is a valuable concept, not a magical one. If we declare that all creators own rights to their creations for 500 years, who cares? Most everything created will be forgotten long before then, people who have never heard of Rachel Ingalls will create countless stories about a mute person meeting a sea creature, and she won't have a thing to say about it because she's dead, and she doesn't seem to have said anything about Del Toro making his movie about the same damn thing. Or perhaps she didn't have access to the funds to fight for her claim to the story? The other issue is that copyright only protects people and corporations who sue over every fractional and imagined impingement upon their property, and it's not always up to you as the creator what that process looks like. If you get hurt in an accident your insurance company will probably sue whoever hurt you for damages, and likewise if you publish a book through Tantor Media and someone writes a thoughtful continuation, you bet Tantor's not asking for consent.

      Look at Star Wars. George Lucas creates a smash hit trilogy. People love it. They write tons of licensed material in-universe. He writes three more movies. They aaaare not a smash hit, but hey. People keep writing more tales in the extended universe. Who does this hurt? Fans get more material, writers make livings, Lucas makes money without having to do more work. But most creators do not make it so easy to create derivative works. Either they create more or their universe and characters die, and for whatever reason, that's completely up to them. The absurd length of copyright claims ensures the magic their audience found in their work will wither away by the time someone who is willing to fan the flame is legally permitted to do so. Firefly will never resolve. Scavengers Reign is over, and if we catch you trying to finish the story you'll face jail time. Westworld isn't just unfinished, it's functionally gone. It has been taken away. And those works were genuinely gargantuan undertakings and there is no way that was the desire of everyone involved.

      • Nothing becomes something from nothing. Stephen King's It has many things in common with Ray Bradbury's Something Wicked This Way Comes (which took its title from Macbeth), such as the seemingly sentient balloon, and Bradbury says he was only really convinced to write it by his friend Gene Kelly. I do not think there is something inherently immoral about this iterative process of inspiration, creation, interpretation, amalgamation and recreation. I do think there is something inherently immoral about claiming "the buck stops here" and arguing for the total independence of your own work. It's all borrowed from our experiences, and our experiences are borrowed from the universe, and when we die no one should really give a shit about whether or not we would consent to something if we were, you know, not dead. Stephen King may have a legal claim to It but it is not his work alone. Maybe a strong case for outsider art being unique could convince me otherwise, but I do not believe we can come to a point of finality where, after we and everyone we've learned from and everyone who has fed us, led us, derided and inspired us has worked on something, after we've taken our materials from the planet and our inspiration from nature, we can say "it's finished, and no one else may touch it."

      Bonus material.

      • L [email protected]

        There are works that are free to use. They could also compensate copyright holders for their work. As they should since they are profiting from it.

        [email protected]
        #427

        I'm all for compensating, but obviously paying in full for every work will never work

        • O [email protected]

          In all of your replies, however, you fail to provide a single example. Are they writing code for you, or creating shitty art for you?

          Guest
          #428

          I have used them in a large variety of ways, from general knowledge seeking to specific knowledge seeking, writing code, generating audio, images, and video. I use them most days, if not essentially every day. What examples would you like me to provide? Tell me and I will provide them.

          • L [email protected]

            I've used them and have yet to get a fully correct result on anything I've asked beyond the absolute basics. I always have to go in and correct some aspect of whatever it shits out. Scraping every bit of data they can get their hands on is only making the problem worse.

            Guest
            #429

            To say you've never gotten a fully correct result on anything has to be hyperbole. These things are tested. We know their hallucination rate, and it's not 100%.

            • Guest

              Who are these international and ideological competitors? What happens if they develop it further than for-profit corporations?

              To me it seems like for-profit corporations themselves are our international and ideological adversaries.

              Guest
              #430

              For example, the US and China are international and ideological competitors.

              • E [email protected]

                I fail to see the significance of not being dominant in bullshit generation, which is OpenAI's specialty.

                Non-LLM machine learning is more interesting, but "write me a poem about how you're my loving ai waifu" is just not a strategic resource.

                Guest
                #431

                I don't think you've thought it through to its logical conclusion, and your example shows a demeaning attitude towards what AI can currently be used for. When you reduce it to what you view as ridiculous uses, you lose sight of the facts.

                • B [email protected]

                  I have conflicting feelings about this whole thing. If you are selling the result of training like OpenAI does (and every other company), then I feel like it’s absolutely and clearly not fair use. It’s just theft with extra steps.

                  On the other hand, what about open source projects and individuals who aren’t selling or competing with the owners of the training material? I feel like that would be fair use.

                  What keeps me up at night is if training is never fair use, then the natural result is that AI becomes monopolized by big companies with deep pockets who can pay for an infinite amount of random content licensing, and then we are all forever at their mercy for this entire branch of technology.

                  The practical, socioeconomic, and ethical considerations are really complex, but all I ever see discussed are these hard-line binary stances that would only have awful corporate-empowering consequences, either because they can steal content freely or because they are the only ones that will have the resources to control the technology.

                  [email protected]
                  #432

                  Japan already passed a law that explicitly allows training on copyrighted material. And many other countries just wouldn’t care. So if it becomes a real problem the companies will just move.

                  I think they need to figure out a middle ground where we can extract value from the for-profit AI companies but not actually restrict the competition.

                  • L [email protected]

                    Technological advances are supposed to improve people's lives, allowing them to work less and enjoy things more often.

                    It's why we invented the wheel. It's why we invented better weapons to hunt with.

                    "Tech for techs sake" is enjoying the technology and ignoring its impact on people's lives.

                    When a society creates a massive sum of information accessible to all, trains new technology on data created by that society, and then a small subset of that society steals and uses that data to profit themselves and themselves alone, I don't know what else to call that but exploitation.

                    Advances in AI should make our lives better. Not worse. Because of our economic model we have decided that technological advances no longer benefit everyone, but hurt a majority of the population for the profits of a few.

                    Guest
                    #433

                    The AI is not the problem in this case. The economic model is. It is not an economic model suitable for the advancement of technology.

                    • O [email protected]

                      The fact that you can't distinguish between being against something vs. being against a double-standard is insane to me.

                      Guest
                      #434

                      I just know where the future is headed. Gotta be realistic about it.

                      • [email protected]
                        This post did not contain any content.
                        [email protected]
                        #435

                        • [email protected]
                          This post did not contain any content.
                          [email protected]
                          #436

                          This is basically a veiled admission that OpenAI are falling behind in the very arms race they started. Good, fuck Altman. We need less ultra-corpo tech bro bullshit in prevailing technology.

                          • Guest

                            I'm not an American, but losing in that area internationally might be way worse than fighting over training data.

                            Maybe not paying the full amount for the copyright, but I agree they should compensate the IP holders.

                            [email protected]
                            #437

                            I don't believe there's a future in AI at all.

                            • Guest

                              It's not shitty tech.

                              [email protected]
                              #438

                              It is the shittiest tech. If you think this bullshit will actually lead to AGI, something that wouldn't be shit, you don't know much about LLMs or are incredibly delusional.

                              • [email protected]

                                It is the shittiest tech. If you think this bullshit will actually lead to AGI, something that wouldn't be shit, you don't know much about LLMs or are incredibly delusional.

                                Guest
                                #439

                                LLMs are an implementation on the way to AGI.

                                • Z [email protected]

                                  So you believe there is no protection for creators at all and removing copyright will help them?

                                  [email protected]
                                  #440

                                  I believe that the protection copyright provides is proportionate to how much you can spend on lawyers. So, no protection for the smallest creators, and little protection for smaller creators against larger corporations.

                                  I support extreme copyright reform, though I doubt it should be completely removed.

                                  • [email protected]
                                    This post did not contain any content.
                                    [email protected]
                                    #441

                                    No amigo, it's not fair if you're profiting from it in the long run.

                                    • [email protected]
                                      This post did not contain any content.
                                      [email protected]
                                      #442

                                      Good. I hope this is what happens.

                                      1. LLM algorithms can be maintained and sold to corpos to scrape their own data, so they can use them for in-house tools or re-sell them to their own clients.
                                      2. Open source LLMs can be made available for end users to do the same with their own data, or to scrape what's available in the public domain for whatever they want, so long as they don't re-sell it.
                                      3. Altman can go fuck himself
                                      • [email protected]
                                        This post did not contain any content.
                                        [email protected]
                                        #443

                                        These fuckers are the first ones to send in tons of lawyers whenever you republish or use any of their IP. Fuck these idiots.

                                        • [email protected]
                                          This post did not contain any content.
                                          Guest
                                          #444

                                          Good. If I ever published anything, I would absolutely not want it to be pirated by AI so some asshole can plagiarize it later down the line and not even cite their sources.
