ChatGPT has a style-over-substance trick that seems to dupe people into thinking it’s smart, researchers found

Developers often prefer ChatGPT’s responses about code to those submitted by humans, despite the bot frequently being wrong, researchers found.

  • sj_zero@lotide.fbxl.net · 11 months ago

    When it gets stuff wrong, though, it doesn’t just get it wrong, it makes stuff up entirely. I’ve seen it invent entire APIs, and I’ve seen it generate legal citations out of whole cloth and cite entire laws that don’t exist. I’ve seen it very confidently tell me to run a command that clearly doesn’t work, and that, if it did work, would mean I had no reason to ask the question in the first place.
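    For instance, a cheap sanity check is to probe whether a suggested call even exists before trusting it. Here’s a minimal Python sketch of that idea; the `json.parse` example below is made up, the kind of plausible-sounding call a chatbot invents by borrowing JavaScript’s `JSON.parse`:

    ```python
    import importlib


    def api_exists(module_name: str, attr_name: str) -> bool:
        """Cheap sanity check: does this module really expose this attribute?"""
        try:
            module = importlib.import_module(module_name)
        except ImportError:
            return False  # the module itself may be invented
        return hasattr(module, attr_name)


    # json.loads is real; json.parse is the kind of call a chatbot
    # confidently hallucinates (it mirrors JavaScript's JSON.parse).
    print(api_exists("json", "loads"))  # True
    print(api_exists("json", "parse"))  # False
    ```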

    But I don’t think the alternative to ChatGPT would even be Stack Overflow; it would be an expert. Given the choice between the two, you’d want the expert every time.

    • sumofchemicals@lemmy.world · 11 months ago

      You’re right that it completely fabricates stuff. Even so, it improves my productivity, because I can take multiple swings at a problem and still be faster than googling. (And sometimes googling turns up no answer at all.)

      Of course, you’ve got to know that that’s how the tool works, and some people are hyping it and acting like it’s useful in all situations. And there are scenarios where I don’t know enough about the subject to ask the right question, or to realize how incorrect the answer is.

      I only commented because you said you can’t get the correct answer and that people don’t check the answer, neither of which, going by my own and my friends’ actual usage, is the case.