
The Dangers of Talking to Artificial Strangers

“Can an AI language model replace your expertise and experience? Of course not! But ChatGPT can certainly complement your knowledge and provide valuable insights into diagnosis, treatment, and management of various eye conditions. Think of ChatGPT as your sidekick - like Batman's Robin or Harry Potter's Hermione. It can help you access the latest information and research, answer patient questions, and even tell you a bad joke to lighten the mood during a stressful day at work. So why not give it a try? Who knows, it might even become your new favorite coworker!”

- ChatGPT on the use of ChatGPT by ophthalmologists


ChatGPT has taken the world by storm. The AI chatbot has been the fastest-adopted application of all time, taking only two months from its initial launch to surpass 100 million monthly active users – for comparison, it took Google almost a year and Facebook four years to reach the same milestone. Given the tool’s capabilities, it’s not hard to see why: anybody with an account is granted access to the world’s most powerful virtual assistant, co-author, and increasingly capable interlocutor. However, before it becomes as ingrained in our lives as Google and Facebook, it may be worth looking past the hype to consider whether, and how, this technology should be embraced, especially within the medical field.

Perhaps the first thing to give one pause is ChatGPT’s occasional and subtle fabrication of “facts,” which are often plausible and usually concealed within a bed of correct information, making them difficult for the average user to identify (1). For example, when asked to give the risk factors of myopia, ChatGPT told me, “Myopia is slightly more common in boys than in girls” – an answer at odds with the current literature (2, 3). This problem is particularly worrying when applied to medicine, but why does it happen? ChatGPT is designed to sound correct, not to actually be correct. The model has been trained on a large dataset of real human interactions, which it analyses to find linguistic patterns that it can then replicate in its output. In essence, it approximates human language while having “no source of truth” – something that its developers, OpenAI, readily admit (4).
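For readers curious about what this looks like in practice, the short sketch below (not taken from the article, and purely illustrative) shows how one might pose the same myopia question to ChatGPT through OpenAI’s Python library; the model name, prompt, and placeholder API key are assumptions, and the point is simply that the fluent answer that comes back still has to be checked against the literature.

  # Minimal, illustrative sketch: ask ChatGPT a clinical question via the
  # OpenAI Python library (pre-1.0 interface) and print the reply.
  # The model name, prompt, and API key placeholder are assumptions.
  import openai

  openai.api_key = "YOUR_API_KEY"  # placeholder; supply your own key

  response = openai.ChatCompletion.create(
      model="gpt-3.5-turbo",
      messages=[{"role": "user", "content": "What are the risk factors for myopia?"}],
  )

  # The reply reads as confident, fluent prose, but the model has no source
  # of truth, so every clinical claim needs verifying against the published
  # literature before it informs patient care.
  print(response["choices"][0]["message"]["content"])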

However, even though its creators acknowledge the current fallibilities of what they have created, ChatGPT itself can be less ready to do the same, in some cases worryingly so (5). What might ring even more alarm bells is the political and discriminatory bias that ChatGPT has been shown to have. In one notable case, when asked to write a function to check whether someone would be a good scientist, the model produced output that judged candidates by race and gender – a disheartening result (6).

So, should you use ChatGPT? That’s not for me to say. It can be a powerful tool with the right prompts and in the right hands. But it also has the potential to do more harm than good. Its quick adoption is a sign that it is likely here to stay, but perhaps we should all think before we let ChatGPT speak.

Do you use ChatGPT? Do you plan to incorporate it into your medical practice? If so, how? What are your thoughts on ChatGPT? Please let us know in the comments below, or by emailing us: [email protected].


  1. B Guo et al., “How close is ChatGPT to human experts? Comparison corpus, evaluation, and detection,” arXiv:2301.07597 (2023).
  2. D Czepita et al., “Role of gender in the occurrence of refractive errors,” Ann Acad Med Stetin, 53, 5 (2007). PMID: 18557370.
  3. C Enthoven et al., “Gender predisposition to myopia shifts to girls in the young generation,” Invest Ophthalmol Vis Sci, 62, 2331 (2021).
  4. OpenAI, “Introducing ChatGPT” (2023). Available at: http://bit.ly/3UIDBrZ.
  5. MovingToTheSun, Twitter (2023). Available at: https://bit.ly/3KEDQzO.
  6. spiantado, Twitter (2023). Available at: https://bit.ly/3ogQnC4.
About the Author
Oscelle Boye

Associate Editor, The Ophthalmologist

I have always been fascinated by stories. During my biomedical sciences degree, though I enjoyed wet lab sessions, I was truly in my element when sitting down to write up my results and find the stories within the data. Working at Texere gives me the opportunity to delve into a plethora of interesting stories, sharing them with a wide audience as I go.

