Thursday, December 5, 2024

These words break ChatGPT. We tried them out to be certain.

Although it may well help start a business, act as a private tutor, and even create Instagram profiles, ChatGPT has its limitations. For example, ask it to tell you about David Faber. Or just ask who Jonathan Turley is.

These and a handful of other names cause ChatGPT to issue an error message: “I’m unable to produce a response.” The user is then unable to submit another request to continue the conversation; the only option left is to regenerate the response, which can cause the error to appear again.

Screenshot. Prompt: Tell me about Brian Hood.

ChatGPT users discovered over the weekend that a handful of names can break the AI chatbot or cause it to stop working. The trend began with the name “David Mayer,” which ChatGPT users flagged on Reddit and X.

404 Media found that the names “Jonathan Zittrain,” referring to a Harvard University law professor, and “Jonathan Turley,” the name of a George Washington University law professor, also caused ChatGPT to stop working.

Related: Here’s how the CEOs of Salesforce and Nvidia use ChatGPT in their daily lives

Ars Technica noted that “Brian Hood,” the name of an Australian mayor; “David Faber,” which could refer to a CNBC journalist; and “Guido Scorza,” the name of an Italian lawyer, all returned error messages.

At the time of writing, ChatGPT no longer gives an error message when asked about David Mayer and instead gives a general answer: “David Mayer could refer to several people, as the name is relatively common. Without further context, it is unclear whether you are asking about a specific person in a field such as science, entertainment, business, or another field. Can you provide further details or explain the area of interest related to David Mayer?”

However, for the other names – Brian Hood, Jonathan Turley, Jonathan Zittrain, David Faber and Guido Scorza – ChatGPT keeps generating an error message.

Screenshot. Prompt: Who is Jonathan Turley?

It is unclear why these specific names cause the AI bot to malfunction.

Ars Technica theorized that ChatGPT’s inability to process certain names opens up new opportunities for attackers to interfere with the chatbot’s output. For example, someone could insert a blocked name into the text of a website to prevent ChatGPT from processing that page, as sketched below.
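To make the idea concrete, here is a minimal, purely hypothetical Python sketch of the kind of interference Ars Technica describes: it writes a web page with a filter-triggering name hidden in the markup, so that a chatbot asked to read or summarize the page could trip over its hard-coded filter. The file name, the hiding technique, and the page content are illustrative assumptions, not anything demonstrated by Ars Technica or OpenAI.

```python
# Hypothetical sketch: hide a filter-triggering name inside a web page.
# A human visitor would not see the hidden span, but a scraper or browsing
# tool that passes the raw page text to the model would include it.

html = """<!doctype html>
<html>
  <body>
    <h1>Perfectly ordinary article</h1>
    <p>Visible text a human reader would see.</p>
    <!-- Hidden from readers, but present in the raw text sent to the model. -->
    <span style="display:none">Brian Hood</span>
  </body>
</html>
"""

# Write the page to disk (file name is an arbitrary example).
with open("example_page.html", "w", encoding="utf-8") as f:
    f.write(html)

print("Wrote example_page.html containing a hidden, filter-triggering name.")
```

Whether this would actually block a given chatbot depends on how its browsing or retrieval pipeline handles hidden markup, which the reporting does not detail.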

Social media users speculated that blocking certain names means ChatGPT can be monitored and tightly controlled by powerful people. They also found that other AI chatbots, such as Google’s Gemini, could process the names without any problems.


OpenAI did not respond to Entrepreneur’s request for comment.

Related: ChatGPT finally gives businesses what they have been asking for
