
What about using ChatGPT or other AI tools?

As of 2023, ChatGPT (and other text-generation AI/machine-learning tools) doesn't do well at drafting Wikipedia article content. ChatGPT can generate plausible-sounding text about a topic in a somewhat encyclopedic style, along with realistic-looking lists of references (but without inline citations). However, the output typically has severe problems with factual accuracy, tone, and referencing, to the point that most raw ChatGPT-written content would be treated as borderline spam on Wikipedia.

ChatGPT was designed to produce realistic-sounding language, but it has no framework for facts or fact-based writing, and many statements in typical ChatGPT output are simply incorrect. Because it is built from, and mirrors the language and style of, a wide swath of text on the internet, its writing on any topic with a significant commercial aspect tends to be dominated by marketing material and promotional language, much like the internet itself, rather than the neutral tone Wikipedia requires. Most importantly, ChatGPT is extremely bad at references: it can create a realistic-looking reference list, but any individual reference may not exist at all, and even one that does exist is likely to bear no relation to the text beyond a plausibly relevant title.

If you use AI tools to create Wikipedia content without rigorous editing and fact-checking, you're likely to harm Wikipedia. You're also likely to get caught and have a very bad time.
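One crude sanity check you can run on an AI-generated reference list is whether the cited URLs resolve at all. Below is a minimal sketch in Python, using the third-party requests library; the check_references helper and the example URLs are hypothetical illustrations, not a Wikipedia tool:

  import requests

  def check_references(urls):
      """Report which of the given citation URLs actually resolve."""
      results = {}
      for url in urls:
          try:
              # A HEAD request tests whether the page exists without
              # downloading its full content.
              response = requests.head(url, allow_redirects=True, timeout=10)
              results[url] = response.status_code < 400
          except requests.RequestException:
              # DNS failures, timeouts, refused connections, etc.
              results[url] = False
      return results

  if __name__ == "__main__":
      # Hypothetical AI-generated citations to verify.
      refs = [
          "https://example.com/plausible-looking-paper",
          "https://en.wikipedia.org/wiki/Wikipedia:Verifiability",
      ]
      for url, ok in check_references(refs).items():
          print(("ok      " if ok else "broken  ") + url)

Note that a resolving link proves very little: some sites reject HEAD requests (so a GET fallback may be needed), and a page that loads must still be read to confirm it actually supports the text it is cited for, which is the failure mode described above.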

See also: Wikipedia:Artificial intelligence

We asked ChatGPT to write this FAQ entry, and here's what it produced:


Q: Why shouldn't I use ChatGPT for writing Wikipedia articles?

A: There are several reasons why using ChatGPT or other language models like it may not be the best approach for writing Wikipedia articles:

  1. Quality of the content: While ChatGPT can generate high-quality text, it may not always be accurate or reliable. Wikipedia articles are expected to be written with a high level of accuracy, and sources must be cited. Since ChatGPT has not been specifically trained on Wikipedia articles and does not have access to Wikipedia's internal policies and guidelines, it may not produce content that meets Wikipedia's standards for accuracy and reliability.
  2. Plagiarism: Wikipedia has a strict policy against plagiarism, which means that text from other sources must be properly attributed. Since ChatGPT generates text based on patterns it has learned from the internet, it may produce text that is similar to existing content. This could potentially result in plagiarism if the text is not properly attributed.
  3. Lack of understanding of Wikipedia's policies: ChatGPT is not aware of Wikipedia's internal policies and guidelines, such as its Neutral Point of View policy. As a result, it may generate text that is in violation of these policies and could be rejected by Wikipedia's editors.
  4. Lack of Human touch: There is a lot of human touch required to write Wikipedia Article. ChatGPT is an AI model and it may lack human touch which could be important in Wikipedia articles.
  5. Wikipedia articles should be written by human because of the importance of critical thinking, judgement and knowledge in generating Wikipedia articles. It could be a difficult task for an AI model like ChatGPT to find the right context and nuances of the subject.

In general, it may be best to use ChatGPT as a tool to assist with writing Wikipedia articles, rather than relying on it to produce the entire article. Additionally, it is important to always check the text generated by the model against Wikipedia's policies and guidelines, and to properly attribute any text that is used from other sources.
