• Swervish@lemmy.ml · 66↑ 10↓ · edited · 1 year ago

    Not trying to argue or troll, but I really don’t get this take, maybe I’m just naive though.

    Like yea, fuck Big Data, but…

    Humans do this naturally, we consume data, we copy data, sometimes for profit. When a program does it, people freak out?

    edit: well, fuck me for taking 10 minutes to write my comment; seems this was already said and covered as I was typing mine lol

    • QHC@lemmy.world · 27↑ 10↓ · 1 year ago

      It’s just a natural extension of the concept that entities have some kind of ownership of their creation and thus some say over how it’s used. We already do this for humans and human-based organizations, so why would a program not need to follow the same rules?

      • FaceDeer@kbin.social · 34↑ 8↓ · 1 year ago

        Because we don’t already do this. In fact, the raw knowledge contained in a copyrighted work is explicitly not copyrighted; people are free to do with it as they please. Only the specific expression of that knowledge can be copyrighted.

        An AI model doesn’t contain the copyrighted works that went into training it. It only contains the concepts that were learned from them.

        • BURN@lemmy.world · 23↑ 16↓ · 1 year ago

          There’s no learning of concepts. That’s why models hallucinate so frequently. They don’t “know” anything, they’re doing a lot of math based on what they’ve seen before and essentially taking the best guess at what the next word is.
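The "best guess at the next word" idea can be sketched with a toy bigram model (an assumption for illustration; real models use learned neural weights over tokens, not raw frequency counts, but the decoding step is the same shape: score candidates, pick the likeliest):

```python
# Toy sketch of next-word guessing: count which word follows which
# in a tiny corpus, then always emit the most frequent follower.
from collections import Counter, defaultdict

corpus = "the cat sat on the mat the cat ate the fish".split()

# Build a table: previous word -> counts of the words seen after it.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def best_guess(word):
    """Return the word most frequently observed after `word`."""
    return follows[word].most_common(1)[0][0]

print(best_guess("the"))  # "cat" (seen twice after "the"; "mat"/"fish" once each)
```

No "knowledge" is stored anywhere here, only statistics of what came before, which is the point being made.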

          • SIGSEGV@sh.itjust.works · 18↑ 10↓ · 1 year ago

            Very much like humans do. Many people think that somehow their brain is special, but really, you’re just neurons behaving as neurons do, which can be modeled mathematically.
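The "neurons modeled mathematically" claim refers to the classic artificial-neuron abstraction: a weighted sum of inputs squashed by a nonlinearity. A minimal sketch (the weights and inputs below are made up for illustration):

```python
# One artificial neuron: output = sigmoid(w · x + b).
# This is the basic unit that neural networks stack by the billions.
import math

def neuron(inputs, weights, bias):
    """Firing strength of one model neuron, between 0 and 1."""
    activation = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-activation))

out = neuron([1.0, 0.5], [0.8, -0.4], bias=0.1)
print(round(out, 3))  # 0.668 (sigmoid of 0.7)
```

Whether this abstraction captures everything biological neurons do is exactly what the thread is arguing about; the sketch only shows what the model actually computes.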

            • Hello Hotel@lemmy.world · 1↑ 2↓ · 1 year ago

              This take often denies that entropy soul or not is critically important for the types of intellegence thats not controlled by reward and punishment with an iron fist.

              • SIGSEGV@sh.itjust.works · 3↑ · 1 year ago

                It sounds like you know English words but cannot compose them. I honestly cannot parse what you said.

            • BURN@lemmy.world · 5↑ 7↓ · 1 year ago

              We can’t even map the entirety of a mouse’s brain, due to the sheer scale at which neurons work. Mapping a human brain 1:1 will eventually happen, and that’s likely around when I’ll be convinced AI is capable of individual thought and actual intelligence.

              • SIGSEGV@sh.itjust.works · 13↑ 15↓ · edited · 1 year ago

                Just saw this today. You should check it out, nitwit: https://www.theguardian.com/science/2023/aug/15/scientists-reconstruct-pink-floyd-song-by-listening-to-peoples-brainwaves

                Edit: “nitwit” was uncalled for, but I do think you are an ignorant person.

                You aren’t magical. You don’t have a soul that talks to Jesus. You’re a bunch of organized electrical signals, a machine. The fact that your machine is carbon-based doesn’t make you special.

                Edit: Downvote all you want, but we’re all still animals. Most people don’t even believe that simple fact. Then again, most people don’t even understand how their cellphone works.

                • BURN@lemmy.world · 13↑ 9↓ · 1 year ago

                  I fundamentally disagree and if that’s your take on humanity I’m scared for our future.

                  There is a human element to us. I’m not spiritual at all. I believe when we die the lights just go out and we cease to exist. But there is undoubtedly a part of us that is still far from being replicated in a machine. I’m not saying it won’t happen, I’m saying we’re a long way from it and what we’re seeing out of current AI is nothing even close to resembling intelligence.

                  • SIGSEGV@sh.itjust.works · 11↑ 6↓ · 1 year ago

                    So when it happens, you’ll change your mind? My point is that what we have today is based on interactions in the human brain: neural networks. You can say, “They’re just guessing the next word based on mathematical models”, but isn’t that exactly what you’re doing?

                    Point to the reason why what comes out of your mouth is any different. Is it because your network is bigger and more complicated? If that’s the case, GPT-4 is closer to being human than GPT-3 was, being a larger model.

                    I just don’t get your point at all.

        • adrian783@lemmy.world · 2↑ 1↓ · 1 year ago

          this is a super-reach. why don’t we deal with AI indistinguishable from humans when that actually happens?

          right now what we have is a language model that is very distinguishable from a human, so it doesn’t get any human considerations.

          if a monkey or a chicken creates an artwork, it doesn’t get copyright either, because it’s not human.

      • Hello Hotel@lemmy.world · 1↑ 1↓ · edited · 1 year ago

        I like that argument as it applies to our AI, which isn’t meant to reject bad ideas or motifs, but to never have a bad idea in the first place. This setup makes the bot’s path of least resistance copying someone’s homework. Nobody wants the bot to do that.

        Someday we may have AI that argument is harder to apply to.

        (an attempt to explain; somewhat tangential)

        Text generators have a “most correct” output that looks and behaves similar to pressing the first keyboard-suggested word over and over. We add noise, where on a dice roll the bot is forced to emit something random instead, like the keyboard example if you typed a random five-letter word every so often.
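That greedy-vs.-dice-roll contrast can be sketched directly (a toy illustration; the candidate words and scores below are made up, and real models sample over probability distributions from a neural network rather than a hand-written table):

```python
# Greedy decoding always takes the top suggestion; adding "noise"
# (a temperature-style dice roll) occasionally picks something else.
import random

def pick_next(scores, temperature=0.0):
    """scores: candidate word -> weight. temperature 0 means always pick the best."""
    if temperature == 0.0:
        return max(scores, key=scores.get)
    # Higher temperature flattens the weights, making unlikely picks more common.
    weights = [s ** (1.0 / temperature) for s in scores.values()]
    return random.choices(list(scores), weights=weights)[0]

suggestions = {"cat": 0.6, "dog": 0.3, "mat": 0.1}
print(pick_next(suggestions))                    # always "cat"
print(pick_next(suggestions, temperature=1.0))   # usually "cat", sometimes not
```

With temperature 0 this is the “press the first suggested word repeatedly” behavior; raising it is the forced random pick the comment describes.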