OpenAI now tries to hide that ChatGPT was trained on copyrighted books, including J.K. Rowling’s Harry Potter series: A new research paper laid out ways in which AI developers should try to avoid showing that LLMs have been trained on copyrighted material.

  • Tyler_Zoro@ttrpg.network · 1 year ago

    AI/LLMs can train on whatever they want, but when these LLMs are then used commercially to make money, an argument can be made that the copyrighted material has been used in a money-making endeavour.

    And does this apply equally to all artists who have seen any of my work? Can I start charging all artists born after 1990 for training their neural networks on my work?

    Learning is not and has never been considered a financial transaction.

    • maynarkh@feddit.nl · edited · 1 year ago

      Actually, it has. The whole concept of copyright is relatively new, and corporations absolutely tried to prevent people who learned proprietary copyrighted information from using it elsewhere.

      It’s just that labor movements got such non-compete agreements thrown out of our society, or at least severely restricted, on humanitarian grounds. The argument is that a human being has the right to seek happiness by learning and using the proprietary information they learned to better their station. Getting to that point, by the way, took a lot of violent convincing.

      So yes, knowledge and information learned is absolutely within the scope of copyright as it stands; it’s only that the fundamental rights humans have override copyright. LLMs (and companies, for that matter) have no such fundamental rights.

      Copyright, by the way, is stupid in its current implementation, but OpenAI and ChatGPT do not get out of it, IMO, just because it’s “learning”. We humans ourselves are only getting out of copyright because of our special legal status.

    • zbyte64@lemmy.blahaj.zone · 1 year ago

      Ehh, “learning” is doing a lot of lifting. These models “learn” in a way that is foreign to most artists. And that’s ignoring the fact that humans are not capital. When we learn, we aren’t building a form of capital; when models learn, they are only building a form of capital.

      • Tyler_Zoro@ttrpg.network · 1 year ago

        Artists, construction workers, administrative clerks, police and video game developers all develop their neural networks in the same way, a method simulated by ANNs.

        This is not “foreign to most artists”; it’s just that most artists have no idea what the mechanism of learning is.

        The method by which you provide input to the network for training isn’t the same thing as learning.
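        To make that mechanism concrete, here is a minimal sketch (assuming nothing beyond the Python standard library; the toy task and all names are illustrative): a single artificial neuron trained by gradient descent, which is the basic operation that ANN “training” refers to.

```python
import math

# A single artificial "neuron": weighted sum plus sigmoid, trained by
# gradient descent -- repeatedly nudging weights to reduce prediction
# error on labelled examples.

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Toy task (illustrative): learn logical OR from four labelled examples.
examples = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]

w = [0.0, 0.0]  # weights
b = 0.0         # bias
lr = 1.0        # learning rate

for _ in range(1000):
    for (x1, x2), target in examples:
        pred = sigmoid(w[0] * x1 + w[1] * x2 + b)
        # Gradient of squared error with respect to the pre-activation.
        grad = (pred - target) * pred * (1 - pred)
        w[0] -= lr * grad * x1
        w[1] -= lr * grad * x2
        b -= lr * grad

# After training, the rounded outputs match the targets.
for (x1, x2), target in examples:
    print((x1, x2), round(sigmoid(w[0] * x1 + w[1] * x2 + b)))
```

        Nothing here resembles a financial transaction either way; the dispute in this thread is whether this weight-nudging process is close enough to human learning to deserve the same legal treatment.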

        • Sentau@lemmy.one · 1 year ago

          Artists, construction workers, administrative clerks, police and video game developers all develop their neural networks in the same way, a method simulated by ANNs.

          Do we know enough about how our brains function and how neural networks function to make this statement?

          • Yendor@reddthat.com · 1 year ago

            Do we know enough about how our brains function and how neural networks function to make this statement?

            Yes, we do. Take a university level course on ML if you want the long answer.

            • Sentau@lemmy.one · 1 year ago

              My friends who took computer science told me that we don’t totally understand how machine learning algorithms work, though that conversation was a few years ago in college. I will have to ask them again.

          • Prager_U@lemmy.world · 1 year ago

            This is orthogonal to the topic at hand. How does the chemistry of biological synapses alone produce a different type of learned model, one that therefore requires different legal treatment?

            The overarching (and relevant) similarity between biological and artificial nets is the concept of connectionist distributed representations, and the projection of data onto lower dimensional manifolds. Whether the network achieves its final connectome through backpropagation or a more biologically plausible method is beside the point.
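            As a concrete (if toy) illustration of projecting data onto a lower-dimensional manifold, here is a minimal sketch using NumPy and PCA via SVD; the dataset and all names are made up for illustration:

```python
import numpy as np

# Toy dataset: 200 points in 3-D that actually lie near a 1-D line,
# i.e. the data occupy a lower-dimensional manifold of the ambient space.
rng = np.random.default_rng(0)
t = rng.normal(size=(200, 1))
data = t @ np.array([[2.0, -1.0, 0.5]]) + 0.01 * rng.normal(size=(200, 3))

# PCA via SVD: the right singular vectors are the principal directions.
centered = data - data.mean(axis=0)
_, singular_values, components = np.linalg.svd(centered, full_matrices=False)

# Project each 3-D point onto the top principal direction: a 1-D
# representation that preserves nearly all of the variance.
projected = centered @ components[0]

# Fraction of the variance captured by that single direction.
explained = singular_values[0] ** 2 / (singular_values ** 2).sum()
print(explained > 0.99)
```

            Whether the low-dimensional representation is found by SVD, by backpropagation, or by synaptic chemistry, the resulting object is the same kind of thing: a compressed, distributed encoding of the data.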

      • Yendor@reddthat.com · 1 year ago

        When we learn, we aren’t building a form of capital; when models learn, they are only building a form of capital.

        What do you think education is? I went to university to acquire knowledge and train my skills so that I could later be paid for those skills. That was literally building my own human capital.