• 4AV@lemmy.world · 1 year ago

    It doesn’t have “memory” of what it has generated previously, other than the current conversation. The answer you get from it won’t be much better than random guessing.

      • sep@lemmy.world · 1 year ago

        Ignoring the huge privacy/liability issue… there are other LLMs than ChatGPT.

      • BetaDoggo_@lemmy.world · 1 year ago

        The model is only trained to handle 4k tokens, roughly 2,000 words depending on complexity. Even if it had a log of everything it had been asked, it wouldn’t be able to use any of it.
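
        The window limit described above can be sketched as follows — a minimal illustration of why old history gets dropped, assuming a rough ~1.3 tokens-per-word heuristic; `fit_context` and the token estimate are hypothetical, not any real model’s tokenizer:

        ```python
        def fit_context(messages, max_tokens=4096):
            """Keep only the most recent messages that fit in the token budget."""

            def est_tokens(text):
                # Rough heuristic: ~1.3 tokens per English word (assumption,
                # not a real tokenizer such as the model actually uses).
                return int(len(text.split()) * 1.3) + 1

            kept = []
            budget = max_tokens
            # Walk backwards from the newest message; stop once the budget runs out.
            for msg in reversed(messages):
                cost = est_tokens(msg)
                if cost > budget:
                    break
                kept.append(msg)
                budget -= cost
            return list(reversed(kept))
        ```

        Anything older than what fits in the window is simply never seen by the model, regardless of what was logged.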