What does HackerNews think of chatgpt-failures?

Failure archive for ChatGPT and similar models

Language: Python

Related / fun :

https://emaggiori.com/chatgpt-fails/

(I don't know the author or the book.)

https://github.com/giuven95/chatgpt-failures

"Plausible explanations: Lack of a world model

Models like ChatGPT do not have a "world model" in the sense that they do not have a comprehensive understanding of the physical and social world, and they do not have the ability to reason about the relationships between different concepts and entities. They are only able to generate text based on patterns they have learned from the training data."

Is anybody compiling a list of errors specific to GPT-4?

This has been a great resource to date:

https://github.com/giuven95/chatgpt-failures

Several thoughts:

* It's not a contradiction. ChatGPT replies using the words "doesn't typically include [additional information]", and including the current date could be an example of atypically included information. In the context of the conversation, there is no contradiction.

* There is too much text between your question and the original prompt, so it cannot recall the prompt accurately. The working buffer on ChatGPT is rather short. I repeated your line of questioning in a new chat, and it was able to describe the prompt correctly. https://i.imgur.com/4zu9HRj.png

* OpenAI doesn't want people reverse-engineering the prompt. Asking questions about the prompt can cause weird behavior.

* It's easy to make ChatGPT generate nonsense, contradictions, and other hallucinated facts. Understand that ChatGPT is a text generation engine, not a logic machine. There is little purpose in debating with it. I think your expectations of what ChatGPT can do are too high. Have a look at this list of ChatGPT failures, and you'll see ChatGPT is confidently dumb. https://github.com/giuven95/chatgpt-failures
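The limited-buffer point above can be sketched with a toy context window: a fixed token budget where the oldest messages, including the original prompt, are dropped first. The word-level "tokens" and the budget size here are illustrative assumptions, not ChatGPT's actual tokenizer or limits.

```python
# Toy sketch of a fixed-size context window. Assumes word-level
# "tokens" and an arbitrary budget -- real models use subword
# tokenizers and much larger context limits.

def visible_context(messages, budget=50):
    """Keep only the most recent messages that fit within the token budget."""
    kept = []
    used = 0
    for msg in reversed(messages):  # walk from newest to oldest
        cost = len(msg.split())     # crude word-count stand-in for tokens
        if used + cost > budget:
            break  # older messages (e.g. the original prompt) fall out
        kept.append(msg)
        used += cost
    return list(reversed(kept))

chat = ["SYSTEM PROMPT: you are a helpful assistant ..."] + \
       [f"filler message number {i} with several extra words" for i in range(10)]

window = visible_context(chat, budget=40)
print("SYSTEM PROMPT" in " ".join(window))  # the prompt no longer fits
```

Once enough conversation accumulates, the model literally cannot see the prompt it is being asked about, which matches the behavior in the screenshot.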

Asked ChatGPT: "George's mother has 4 children: John, Mary, and Tom. What is the name of the fourth child?", and it answered: "The fourth child's name is not given in the information provided." and even, after rephrasing, "The name of the third child is not given in the statement 'George's mom has 3 children: John and Mary.' So, it's impossible to say what is the name of the 3rd child."

Not sure whose skillset is being threatened, 5-year-olds?

https://github.com/giuven95/chatgpt-failures has more failures; some have since been fixed. I laughed a bit at:

  me: "write a sentence ending with the letter s"

  ChatGPT: "The cat's fur was as soft as a feather."
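Failures like this one are easy to verify mechanically. A few lines of Python (my own sketch, not from the repo) can check whether a sentence actually ends with a given letter, ignoring trailing punctuation:

```python
import string

def ends_with_letter(sentence, letter):
    """Return True if the last alphabetic character of `sentence` is `letter`."""
    # Strip trailing punctuation and whitespace before comparing.
    stripped = sentence.rstrip(string.punctuation + string.whitespace)
    return stripped[-1:].lower() == letter.lower()

print(ends_with_letter("The cat's fur was as soft as a feather.", "s"))  # False
print(ends_with_letter("The garden was full of flowers.", "s"))          # True
```

ChatGPT's answer fails the check: "feather" ends with "r", not "s".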