• kaffiene@lemmy.world · 18 days ago

    I’m 100% sure he can’t. Or at least, not from LLMs specifically. I’m not an expert, so feel free to ignore my opinion, but from what I’ve read, “hallucinations” are a feature of the way LLMs work.

    • rottingleaf@lemmy.zip · 17 days ago

      One can have an expert system assisted by ML for classification. But that’s not an LLM.
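      A minimal sketch of what that split might look like (the task, names, and thresholds here are all hypothetical, and scikit-learn’s LogisticRegression stands in for whatever ML classifier you’d actually use):

      ```python
      # Sketch: an ML classifier handles fuzzy pattern recognition; a
      # deterministic rule-based "expert system" layer makes the final call.
      # Because the rule layer only emits fixed, auditable outcomes, its
      # failure mode is a refusal, not a fabricated (hallucinated) answer.
      import numpy as np
      from sklearn.linear_model import LogisticRegression

      # --- ML layer: learns to classify toy 2-feature inputs ---
      X_train = np.array([[0.1, 0.2], [0.9, 0.8], [0.2, 0.1], [0.8, 0.9]])
      y_train = np.array([0, 1, 0, 1])  # 0 = "benign", 1 = "anomalous"
      clf = LogisticRegression().fit(X_train, y_train)

      # --- Expert-system layer: explicit rules over the ML output ---
      def decide(sample: np.ndarray) -> str:
          p_anomalous = clf.predict_proba(sample.reshape(1, -1))[0, 1]
          if p_anomalous > 0.9:   # thresholds are illustrative
              return "flag for review"
          if p_anomalous < 0.1:
              return "pass"
          return "uncertain: defer to a human"

      print(decide(np.array([0.85, 0.9])))
      ```

      The point of the split is that the generative/probabilistic part is confined to classification, while everything the user sees comes from fixed rules; that’s a very different architecture from an LLM producing free-form text.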