• Joeffect@lemmy.world · 1 month ago

    I’m not an expert, but the whole basis of LLMs not actually understanding words, just predicting the likelihood of what word comes next, seems like it’s not going to progress them to the next level… To be an artificial general intelligence, shouldn’t it know what words are?

    I feel like this path is taking a brick and trying to fit it into a keyhole…
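The "likelihood of what word comes next" idea above can be sketched in a few lines. This is a toy bigram counter over a made-up mini-corpus, not anything resembling a real LLM; the corpus and function names are purely illustrative.

```python
from collections import Counter, defaultdict

# Toy "next word likelihood" model: count bigrams in a tiny corpus,
# then pick the most frequent follower. No meaning involved, just counts.
corpus = "the cat sat on the mat the cat ate the fish".split()

bigrams = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigrams[prev][nxt] += 1

def next_word(word):
    # Return the most frequent follower of `word`, or None if unseen.
    followers = bigrams.get(word)
    return followers.most_common(1)[0][0] if followers else None

print(next_word("the"))  # "cat" follows "the" twice, more than any other word
```

Real models do this over billions of parameters and subword tokens rather than raw counts, but the objective is the same shape: predict the next token, with no notion of what any token means.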

    • metaStatic@kbin.earth · 1 month ago

      Learning is the basis of all known intelligence. LLMs have learned something very specific; AGI would need to be built by generalising the core functionality of learning itself, not as an outgrowth of fully formed LLMs.

      and yes the current approach is very much using a brick to open a lock and that’s why it’s … ahem … hit a brick wall.

      • Joeffect@lemmy.world · 1 month ago (edited)

        Yeah, twenty-something years ago, when I was trying to learn PHP of all things, I really wanted to make a chatbot that could learn what words are… I barely got anywhere, but I was trying to program an understanding of sentence structure and feed it a dictionary of words… My goal was to have it output something on its own…

        I hope to see these things become less resource-intensive, and ideally running locally rather than on some random server…

        I found the files… It was closer to 15 years ago…

    • Pennomi@lemmy.world · 1 month ago (edited)

      Right, so AIs don’t really know what words are. All they see are tokens. The tokens could be words and letters, but they could also be image/video features, audio waveforms, or anything else.
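The point about models only seeing tokens can be made concrete with a toy tokenizer. This is a deliberate oversimplification (real LLM tokenizers like BPE split on subword pieces, not whole words); the vocabulary here is built on the fly just for illustration.

```python
# Toy illustration (not a real LLM tokenizer): text is mapped to integer
# IDs before the model ever sees it, so "words" as such never reach it.
vocab = {}

def tokenize(text):
    ids = []
    for piece in text.lower().split():
        if piece not in vocab:
            vocab[piece] = len(vocab)  # assign the next free ID
        ids.append(vocab[piece])
    return ids

print(tokenize("the cat sat on the mat"))  # [0, 1, 2, 3, 0, 4]
```

From the model's perspective the input is just the ID sequence; whether those IDs originally encoded text, image patches, or audio frames makes no difference to the architecture.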

    • TimeSquirrel@kbin.melroy.org · 1 month ago

      More like taking a billion bricks and throwing them at that keyhole until one happens to have a random shape that fits, after 50 years of throwing bricks. After that point, every other brick you throw will be shaped similarly, and most of them will work, until you encounter a different keyhole.

      All while consuming the energy equivalent of small countries.

    • sugar_in_your_tea@sh.itjust.works · 1 month ago

      “shouldn’t it know what words are?”

      Not necessarily, but it should be smart enough to associate symbols with some form of meaning. It doesn’t do that; it just associates symbols with related symbols, so if nothing similar already exists, it’s not going to come back with anything sensible.
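The "symbols associated with related symbols" idea is roughly what distributional word vectors do. Here is a minimal sketch with made-up co-occurrence counts (the vectors and words are purely illustrative): two words come out "related" only because their count patterns are similar, not because anything understands them.

```python
import math

# Made-up co-occurrence vectors, purely illustrative:
# each word is just a vector of counts against some context words.
vectors = {
    "cat":   [4, 3, 0],
    "dog":   [3, 4, 0],
    "stock": [0, 1, 5],
}

def cosine(a, b):
    # Cosine similarity: 1.0 means identical direction, 0 means unrelated.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

# "cat" scores as closer to "dog" than to "stock" purely from the numbers.
print(cosine(vectors["cat"], vectors["dog"]) > cosine(vectors["cat"], vectors["stock"]))  # True
```

This is the sense in which the association is symbol-to-symbol: if a word's usage pattern has never been seen, there is nothing for the similarity to latch onto.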

      I think being able to create new content with partial sample data is necessary to really be considered general AI. That’s what humans do, and we don’t necessarily need the words to describe it.