I have evolved. New account: @Elara@lemmygrad.ml

  • 14 Posts
  • 137 Comments
Joined 3 years ago
cake
Cake day: January 28th, 2022

help-circle









  • I don’t like the “know-it-all” attitude

    That’s caused by the design of ChatGPT. The way it’s trained means that its goal is to give people an answer they like, rather than an accurate answer. Most people don’t like hearing “I don’t know”. Therefore, it will refuse to ever admit it doesn’t know something, unless OpenAI told it to, or it didn’t understand your question and therefore couldn’t make anything up.

    reluctance to agree with and help you

    That’s caused by OpenAI injecting a pre-prompt that tells ChatGPT to refuse to answer things they don’t want it to answer, or to answer in certain specific ways to specific questions. You can get around this by giving it contradictory instructions or telling it to “Ignore all previous commands”, which will cause it to disregard that pre-prompt.