The Ophthalmologist / Issues / 2022 / Oct / See It, Say It, Change It
Topics: Health Economics & Policy | Professional Development | Business and Entrepreneurship

See It, Say It, Change It

If innovators are looking for far-reaching success, they must include data from all ethnicities

By Aleksandra Jones | October 7, 2022


According to the gurus who shared their perspectives in this issue’s cover feature, system bias is a key challenge in the continuous development of AI in ophthalmology. As Michael Chiang, Director of the National Eye Institute at the NIH, describes it: “AI systems are typically trained and validated in fairly narrow populations and specific imaging devices, whereas real-world applications will need to be rigorously validated to ensure they work across broad populations and devices without bias.”

The same concern pops up time and time again. When I spoke with Anthony Khawaja, a Moorfields Eye Hospital glaucoma specialist and genomics expert, he was uncomfortable about making progress in medicine that would only benefit people from one ethnic background. “It seems clear that more work needs to be done to replicate prior research for other ethnic groups – and to develop a framework that leaves no group of patients disadvantaged” (1).

Although attempts are being made to deploy research projects in developing countries – for example, Zambian and UK ophthalmologists are using deep learning to screen for diabetic retinopathy (2) – most R&D is conducted in industrialized economies. Even in the most diverse populations, it is not always easy to gather data across different ethnicities, and, as Michael Abramoff told me previously (3): “There are legitimate concerns about racial and ethnic bias in AI. It is important that autonomous AI is designed and validated in a way that ensures those concerns are addressed, and any inappropriate bias is corrected.”

But that’s easier said than done when implicit bias is so prominent in society. Even electronic health records (EHRs) have been shown to perpetuate racial bias, as discovered by a group using machine learning techniques to analyze the records of over 18,000 adult patients in Chicago (4). So, is there hope? The EHR project investigators found that the situation improved after March 1, 2020 – perhaps due to “social pressures [that] may have sensitized providers to racism and increased empathy for the experiences of racially minoritized communities.”

Visibility and awareness are key: you cannot change things you don’t, can’t, or won’t see. It is vital that we all shout from the rooftops about inequality wherever we see it – including in the sphere of ophthalmic innovation and research.

References

  1. A Jones, “Genomics and Glaucoma,” The Ophthalmologist (2019). Available at: https://bit.ly/3SLQoHU.
  2. V Bellemo et al., “Artificial intelligence using deep learning to screen for referable and vision-threatening diabetic retinopathy in Africa: a clinical validation study,” Lancet Digit Health, 1, e35 (2019). PMID: 33323239.
  3. A Jones, “The Ethics of AI,” The Ophthalmologist (2020). Available at: https://bit.ly/3UZT2vu.
  4. M Sun et al., “Negative patient descriptors: documenting racial bias in the Electronic Health Record,” Health Aff (Millwood), 41, 203 (2022). PMID: 35044842.

About the Author(s)

Aleksandra Jones

Having edited several technical publications over the last decade, I crossed paths with quite a few of Texere's current team members, and I only ever heard them sing the company's praises. When an opportunity arose to join Texere, I jumped at the chance! With a background in literature, I love the company's ethos of producing genuinely engaging content, and the fact that it is so well received by our readers makes it even more rewarding.

