• deadbeef79000@lemmy.nz · 6 months ago

    That’s assuming the CEO isn’t already hallucinating.

    At least when an LLM hallucinates, you can tell it so and it won’t fire you.

    • TheObviousSolution@lemm.ee · 6 months ago

      It doesn’t have the power to do so. But it does have the power to shrug off your questions. Has an LLM ever shrugged off your questions?

      • deadbeef79000@lemmy.nz · 6 months ago

        Sort of: I had GitHub Copilot hallucinate an AWS CloudFormation template stanza.

        Asked it for the source it used for the stanza, and it gave me a URL.

        Told it that the crap it just gave me wasn’t on that page.

        It apologised and told me to RTFM.

        So, yeah, even super autocorrect is a dick.