Doctors recently warned against using ChatGPT, a new AI chatbot, for medical advice after a study found it fabricating health data when asked about cancer. The study, conducted by researchers from the University of Maryland School of Medicine, revealed that ChatGPT answered up to one in ten questions about breast cancer screening incorrectly, and that its responses were far less ‘comprehensive’ than those of a simple Google search. The chatbot even cited fake journal articles to support its incorrect claims.
Given these concerns, technology experts are calling for a halt to the ‘arms race’ of rushing to develop ever more powerful AI systems, fearing the risks they may pose to humanity. While a complete stop to AI advancement may not be the most appropriate solution, it is clear that the technology must be used with caution.
Microsoft, one of the biggest names in the tech world, has invested heavily in ChatGPT and incorporated the software into its search engine, Bing, as well as into Office 365 apps including Word, PowerPoint, and Excel. The software giant has set realistic expectations, admitting that ChatGPT may make mistakes. AI experts refer to this phenomenon as ‘hallucination’, in which the chatbot makes up an answer when it cannot find one in the data it was trained on.
Nevertheless, despite these shortcomings, ChatGPT correctly answered most questions, including those on the symptoms and risk factors of breast cancer, as well as questions on the cost of mammograms and the recommended age and frequency of screening. It was also able to summarize this information in a more easily digestible form.
Dr. Paul Yi, one of the co-authors of the study, offered a balanced view of ChatGPT based on the results, noting that the proportion of correct answers was ‘pretty amazing’. He reminded users, however, to always consult a medical professional rather than rely solely on AI chatbots, as these remain relatively new and unproven technologies.