The public has scoffed at students for “cheating” their way through classes with AI homework helpers. Many feel it is dishonest to use such technology for assignments, so how would those same people feel if they knew that researchers publishing in credible scientific venues have started listing one of these same “homework helpers” as a co-author of their work?
An artificial intelligence chatbot called ChatGPT has been listed as a co-author on four different published papers. ChatGPT is a tool built on a large language model trained on a vast bank of text. That model allows it to understand prompts and generate text that reads like natural human conversation. This feature makes it appealing to people who need to write an essay for class, a letter to a friend, and now, perhaps, a portion of a scientific paper. As more researchers adopt ChatGPT as a writing tool, other scientists are voicing their disapproval.
According to the journal Nature, experts and publishers have pointed out that ChatGPT does not meet the necessary requirements to be listed as an author of research. To be a co-author of a paper, a contributor must be able to be held accountable and responsible for its content. As an online tool with no ability to consent to publication or defend the integrity of its work, ChatGPT does not fulfill the current standards held for authors. Richard Sever, the assistant director of Cold Spring Harbor Laboratory Press, pointed out that this issue has brought up our “need to distinguish the formal role of an author of a scholarly manuscript from the more general notion of an author as the writer of a document.” An author needs to be able to respond to any allegation of scientific misconduct, such as plagiarism, false conclusions, or misleading data. While ChatGPT was trained on a large body of data, it has been shown to make inaccurate statements often, leaving it prone to exactly the kinds of errors an author must answer for.
In one instance, a group of researchers asked the chatbot to find citations evaluating the association between coffee intake and liver cancer risk in Japan. It supplied the title of a non-existent article along with a digital object identifier that actually pointed to a completely unrelated paper by different authors. The same group also asked for the top causes of death in Japan in 2020, and it responded with “1. Cancer: 29.5% of deaths, 2. Heart disease: 15.1% of deaths, 3. Pneumonia: 8.4% of deaths.” However, a quick Google search showed that the actual leading causes of death that year were malignant neoplasms at 27.3%, heart disease at 15%, and senility at 8.8%.
This inconsistent accuracy is a concern, since fabricated claims in publications could mislead society and have detrimental effects. ChatGPT's training data only extends through 2021, so it cannot be relied upon for current information, and it learns through human AI trainers: essentially, the trainers have the chatbot generate responses, which are then scored and fine-tuned for conversational purposes. The bot is best used for its language abilities rather than as a source of up-to-date facts. Its answers also vary with the phrasing of a question, and it sometimes gives different responses to the very same question. Varying responses might be convenient for someone who needs help writing an email, but consistency is crucial in scientific research. Reliable reproducibility is a core value in the scientific community, and the lack of this component causes a source to lose all credibility.
Despite the surge of opposition, some researchers believe ChatGPT should be allowed as a tool, provided its contributions are properly cited as a source rather than credited to an author and are thoroughly fact-checked. The AI program may yet open doors to new research, but for now many scientific journals, including Nature, will not publish a paper that lists this controversial author.