• @Cyberflunk@lemmy.world
    6 months ago

    Hallucinations, like depression, are a multifaceted issue. Training data is only a piece of it. Quantized or overfitted models rely on memorization at the cost of otherwise correct training data. Poorly structured inference can also confuse a model.

    Rest assured, this isn’t just training data.
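    As a minimal illustration of the quantization point (my own sketch, not from the comment): rounding float32 weights to int8 with a symmetric per-tensor scale discards precision, so each weight picks up a small reconstruction error that accumulates across millions of parameters.

    ```python
    import numpy as np

    def quantize_int8(w):
        """Symmetric per-tensor int8 quantization (one common scheme)."""
        scale = np.abs(w).max() / 127.0
        q = np.clip(np.round(w / scale), -128, 127).astype(np.int8)
        return q, scale

    def dequantize(q, scale):
        return q.astype(np.float32) * scale

    rng = np.random.default_rng(0)
    w = rng.normal(size=1000).astype(np.float32)   # stand-in for a weight tensor
    q, scale = quantize_int8(w)
    w_hat = dequantize(q, scale)

    # Per-weight error is bounded by half a quantization step (scale / 2),
    # small individually but nonzero everywhere in the network.
    print("max abs error:", np.abs(w - w_hat).max())
    ```

    The same idea applies per-channel or per-group in real deployments; the coarser the scheme, the more the quantized weights drift from what training actually learned.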

    • KillingTimeItself
      6 months ago

      yeah, there's also this stuff, though i consider that a technical challenge rather than a hard limit.