Will chatbots write scientific essays? | Turtles AI

Springer Nature, the world's largest academic publisher, has announced that AI writing tools such as ChatGPT will not be credited as authors in papers published across its thousands of journals. The company has no objection, however, to scientists using AI to help write research or generate ideas, provided the authors properly disclose the AI's contribution.

The announcement follows a small number of papers, preprints, and scientific articles that named ChatGPT or earlier large language models (LLMs) as authors. The nature and degree of these tools' contributions vary case by case: some papers clearly label the AI-generated text, while others only acknowledge the program's "contribution to the writing of several sections of this manuscript."

The scientific community has largely criticized crediting AI as an author, with many arguing that software cannot fulfill the required duties of authorship, such as being accountable for a publication or holding intellectual property rights in its work. The output of AI writing software is also known to amplify social biases and produce "plausible bullshit."

Nature's editor-in-chief, Magdalena Skipper, says the company's policy on the issue is clear: "we don't prohibit their use as a tool in writing a paper." She emphasizes, however, the importance of transparency about how a paper is put together and what software is used. AI tools have a wide range of legitimate applications in scientific research, she notes, such as iterating on experiment design or helping researchers for whom English is not their first language. Skipper adds that outright bans on AI in scientific work would be ineffective; instead, she believes the scientific community should come together to establish new norms for disclosure and guidelines for the use of AI in research.