
ChatGPT and Fake Citations

Duke University Libraries: “…ChatGPT is an Artificial Intelligence chatbot developed by OpenAI and launched for public use in November 2022. While other AI chatbots are also in development by tech giants such as Google, Apple, and Microsoft, OpenAI’s early rollout has eclipsed the others for now, with the site reaching more than 100 million users in two months. For some perspective, that is faster widespread adoption than TikTok, Instagram, and many other popular apps. What you may not know about ChatGPT is that it has significant limitations as a reliable research assistant. One such limitation is that it has been known to fabricate or “hallucinate” (in machine learning terms) citations. These citations may sound legitimate and scholarly, but they are not real. It is important to note that AI can confidently generate responses without any backing data, much as a person experiencing hallucinations can speak confidently without sound reasoning. If you try to find these sources through Google or the library, you will turn up NOTHING. Why does it do this?

ChatGPT is built on a Large Language Model and has been trained on a huge dataset of internet sources. It can quickly generate easy-to-understand responses to almost any question you ask. But the responses are only as good as the quality of the data it was trained on. Its core strength lies in recognizing language patterns, not in reading and analyzing lengthy scholarly texts. Given that, it may not be the most reliable source for in-depth research. The following is a short list of what we’ve observed ChatGPT is good for and not good for…”
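
The practical upshot of the Duke guide is: verify before you cite. As an illustration (not part of the original guide), the Python sketch below checks a citation’s title against the public Crossref REST API. An empty result is a warning sign, though not proof, that the citation was fabricated. It assumes the requests library is installed and that the work, if genuine, is indexed in Crossref (most journal articles with DOIs are; books and grey literature may not be).

import requests

def verify_citation(title, author=""):
    """Search the public Crossref API for works matching a citation's title.

    Returns a list of candidate matches (possibly empty). An empty list
    suggests, but does not prove, that the citation may be fabricated.
    """
    params = {"query.bibliographic": title, "rows": 5}
    if author:
        params["query.author"] = author
    resp = requests.get("https://api.crossref.org/works", params=params, timeout=10)
    resp.raise_for_status()
    items = resp.json()["message"]["items"]
    return [
        {
            # Crossref returns titles as a list; fall back if it is missing.
            "title": (item.get("title") or ["(untitled)"])[0],
            "doi": item.get("DOI"),
            "year": (item.get("issued", {}).get("date-parts") or [[None]])[0][0],
        }
        for item in items
    ]

if __name__ == "__main__":
    # Example: check a citation produced by ChatGPT before trusting it.
    for match in verify_citation("Attention Is All You Need"):
        print(match)

A match still needs human review: confirm that the returned title, authors, and year actually correspond to the citation as given, since chatbots sometimes attach real titles to the wrong authors or journals.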
