hallucination

artificial intelligence

Learn about this topic in these articles:

ChatGPT

  • In ChatGPT

    …models, ChatGPT can sometimes “hallucinate,” a term used to describe the tendency of such models to respond with inaccurate or misleading information. For example, ChatGPT was asked to tell the Greek myth of Hercules and the ants. There is no such Greek myth; nevertheless, ChatGPT told a story of…

  • In chatbot: Later developments

    …false or misleading data (termed hallucinations) or even to plagiarize information.
