Artificial Intelligence in Pathology: Challenges and Considerations
by Chad Salinas, Vice President, AI and Machine Learning
Artificial Intelligence is rapidly expanding its impact on healthcare. As deep learning matures in computer vision, its application to digital pathology is a natural step, with the promise of aiding routine reporting and standardizing results across trials.1
Physicians recognize the potential of AI in pathology: in one global survey, nearly 75% of respondents reported interest in or excitement about AI as a diagnostic tool to improve workflow efficiency and quality assurance in pathology.2 This finding is, in part, a recognition of the increasing complexity of practicing pathology: complicated workflows3 coupled with global pathology workforce shortages4,5 impact pathologist workloads.6 These factors, individually and in concert, create opportunities that AI is well suited to address, such as:
a) distinguishing benign tissue from tumor
b) grading dysplasia and in situ lesions
c) assessing evidence and extent of invasion
d) identifying micrometastases in lymph node resections
e) scoring multiple IHC/ISH biomarkers and mapping the topography of the immune response
f) estimating the percentage of tumor and overall cellular content
g) extracting new patterns from digital images and clinical correlates (next-generation morphology)
h) automating the management and prioritization of pathology workflow7
For more than 150 years, Leica Biosystems has worked with the pathology community to introduce innovations that span the workflow, from biopsy through diagnosis. As with all we do at Leica Biosystems, we are guided by our mission: Advancing Cancer Diagnostics, Improving Lives. Now, we are accelerating our work to realize the potential of AI in pathology.
I have the privilege of leading our growing AI team. While it is early in our endeavor, I’m energized by our existing partnerships with laboratory professionals, pathologists and healthcare organizations, and our progress with like-minded leaders including Paige AI.
I’m even more excited by what's yet to come.
As the pathology ecosystem moves forward bringing AI into clinical practice, it is vital we do so collectively. It is in that spirit that I share some thoughts on four areas of consideration when evaluating how AI may become a valuable aid to the pathologist: Empowerment, Ethics, Ecosystem, and Transparency.
Consideration #1: Empowerment
AI should be technology helping humans, not replacing them.
AI technology should be designed to enhance and extend the capabilities and potential of pathology professionals, not replace them. Whether the user is a laboratory professional or a pathologist, AI can help ensure they have the information needed to make the best decisions possible.
There is potential for AI tools to increase efficiencies in clinical workflow by reducing manual diagnostic steps for the pathologist through the automation of certain types of human tasks. These tasks can then be performed at high levels of speed and reproducibility, thus augmenting and freeing expert personnel to focus on tasks that require human judgment. For example, a pathologist might visually determine a microscopic region of interest on a slide (e.g., focus of invasive carcinoma), and an AI application could compute both the area and the number of particular features within that region (e.g., number of mitoses per mm²).8 In doing so, AI has the potential to free up time, enabling the pathologist to focus on complex cases that require their expertise.
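To make that division of labor concrete, here is a minimal sketch of the kind of computation such a tool might perform once the pathologist has outlined a region of interest and a detector has returned candidate mitoses. All names, units, and values are illustrative assumptions, not any actual product's API:

```python
# Hypothetical sketch: compute mitotic density inside a pathologist-drawn
# region of interest (ROI). Names and values are illustrative only.

def mitotic_density(roi_area_um2, mitosis_coords):
    """Return mitoses per mm^2, given the ROI area in square micrometers
    and detector-reported mitosis coordinates inside that ROI."""
    roi_area_mm2 = roi_area_um2 / 1_000_000  # 1 mm^2 = 1e6 um^2
    return len(mitosis_coords) / roi_area_mm2

# Example: 12 detected mitoses in a 2.5 mm^2 focus of invasive carcinoma.
density = mitotic_density(2_500_000, [(0.0, 0.0)] * 12)
print(round(density, 2))  # 4.8 mitoses per mm^2
```

The pathologist still makes the judgment call (where the invasive focus is); the tool only handles the repetitive counting and arithmetic.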
Further, I believe AI has the potential to benefit ALL of us in the pathology ecosystem, and that the benefits of the AI era should reach the many, not just an elite few. The soundest approaches are those that ensure deployment considers the realities and needs of a stand-alone independent laboratory as much as those of laboratories embedded in large health systems.
Consideration #2: Ethics
Patient safety and privacy are critical considerations when developing AI tools.
How will AI prioritize patient safety and privacy? How will AI address the possibility of data-driven bias? One initial step for consideration is training AI tools on de-identified, stratified data sets. For example, Leica Biosystems is currently working with the UK National Pathology Imaging Co-operative (NPIC) to develop and validate AI tools for the assessment of breast cancer, with an ultimate goal of deploying these as part of the routine diagnostic pathway.
The importance of weighing risks and benefits is a matter of significant discussion in the pathology profession. For example, Jackson et al. discuss a number of considerations in their recent paper, The Ethics of Artificial Intelligence in Pathology and Laboratory Medicine: Principles and Practice, such as “the ethical development, validation, and implementation of medical AI applications in pathology and laboratory medicine.”9 The Digital Pathology Association10 also discusses patient considerations related to successful adoption of AI for pathology.
Consideration #3: Ecosystem
AI requires collaboration with, and beyond, the ‘traditional’ pathology community.
Next, as one considers potential applications of AI in routine pathology practice, it becomes clear that AI cannot exist in isolation. Rather, AI tools must integrate into the already robust pathology workflow, follow implementation standards, and be practical to use.
The generation of a successful AI workflow ultimately depends on seamless interoperability between AI, digital pathology, and laboratory information systems.11 To that end, AI offerings should draw on a cross-section of experts — research, clinical, workflow, data science, software, hardware and more — all working in close collaboration. Extensive testing and validation of AI offerings must be conducted prior to commercial use to ensure the offerings meet expectations and achieve intended outcomes.
It’s also vital that AI platforms are built within the context of the problem they are solving and in collaboration with industry experts. Both cognitive systems and end users must be trained together as part of a symbiotic relationship. No one technology or person can or should do it alone. Companies must be prepared to invest in training users as well as training the system itself.
Consideration #4: Transparency
Demystify AI to help build trust.
Explanation in AI systems is considered critical across all areas where machine learning is used.12 Specifically, incorporating Explainable Artificial Intelligence (XAI) into product development may be one way to begin approaching transparency around artificial intelligence. XAI is a field concerned with the development of new methods that explain and interpret machine learning models, one that has been tremendously reignited over recent years.13
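One simple, model-agnostic XAI technique is occlusion sensitivity: mask part of the input and measure how much the model's score drops, so the regions the prediction depends on light up. The sketch below uses a deliberately trivial stand-in "model" rather than a real pathology network, purely to show the mechanics:

```python
# Minimal occlusion-sensitivity sketch (a model-agnostic XAI technique).
# The "model" is a toy scorer; all names here are illustrative.

def occlusion_map(image, score_fn, patch=2, fill=0.0):
    """For each patch-sized window, occlude it and record how much the
    model's score drops. Larger drops mark regions the score depends on."""
    h, w = len(image), len(image[0])
    base = score_fn(image)
    heat = [[0.0] * w for _ in range(h)]
    for top in range(0, h, patch):
        for left in range(0, w, patch):
            occluded = [row[:] for row in image]   # copy, then mask one patch
            for r in range(top, min(top + patch, h)):
                for c in range(left, min(left + patch, w)):
                    occluded[r][c] = fill
            drop = base - score_fn(occluded)
            for r in range(top, min(top + patch, h)):
                for c in range(left, min(left + patch, w)):
                    heat[r][c] = drop
    return heat

# Toy scorer: mean intensity of the top-left 2x2 quadrant, so by
# construction only that region should matter to the "model".
def toy_score(img):
    return sum(img[r][c] for r in range(2) for c in range(2)) / 4

img = [[1.0] * 4 for _ in range(4)]
heat = occlusion_map(img, toy_score)
# Only the top-left patch of the heat map shows a score drop;
# the rest stays at zero, matching how the scorer was built.
```

The appeal of this family of methods for transparency is that the explanation is stated in the input's own terms (regions of the image), which a pathologist can inspect directly against the tissue.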
Further, we need to be conscious of the evolving regulatory environment around transparency in AI, which is in early development and of international interest.14 Governmental bodies, including the U.S. Food & Drug Administration and the European Commission, and organizations such as the Center for Data Innovation have only recently started publishing guidance, as doing so requires robust discussion of numerous complex topics, ranging from testing and validation to ongoing monitoring and expanded usability.15 Visibility into the conversations regarding regulation and, ultimately, regulatory decisions regarding AI will be essential as we as an ecosystem ‘walk’ through this untrodden territory and, together, learn as we progress. At Leica Biosystems, we’re pleased to be participating in working groups on this topic and will continue to engage in the evolving dialogue and share our perspectives.
Continuing the Conversation
What’s perhaps most promising is the power that AI for pathology has to improve so many parts of the healthcare ecosystem. From oncology to genomics, the pathology community must focus on converging data and insights from across the healthcare ecosystem to support its larger role in improving healthcare. At the end of the day, the biggest opportunities will come from data and be powered by collaboration.
- Bizzego A, Bussola N, Chierici M, Maggio V, Francescatto M, Cima L, et al. (2019) Evaluating reproducibility of AI algorithms in digital pathology with DAPPER. PLoS Comput Biol 15(3): e1006269. doi.org/10.1371/journal.pcbi.1006269.
- Sarwar, S., Dent, A., Faust, K. et al. Physician perspectives on integration of artificial intelligence into diagnostic pathology. npj Digit. Med. 2, 28 (2019). doi.org/10.1038/s41746-019-0106-0
- Magliocco, Anthony, MD for Leica Biosystems Knowledge Pathway. Anatomic Pathology's Quality Journey. Webinar housed on Leica Biosystems website. Accessed 22 June 2021.
- Cancer Research UK. Testing times to come? An evaluation of pathology capacity across the UK. 2016. Accessed 11 March 2020.
- Robboy SJ, Weintraub S, Horvath AE, et al. Pathologist workforce in the United States: I. Development of a predictive model to examine factors influencing supply. Arch Pathol Lab Med. 2013;137:1723–1732.
- Williams BJ. The Future of Pathology Expert Report: Pathologist and Machine: The Perfect Partnership. 23. Accessed 18 June 2021.
- Salto-Tellez, M., Maxwell, P., & Hamilton, P. W. (2018). Artificial Intelligence - The Third Revolution in Pathology. Histopathology. https://doi.org/10.1111/his.13760. 5-6. Accessed 18 June 2021.
- Jackson BR, Ye Y, Crawford JM, et al. The Ethics of Artificial Intelligence in Pathology and Laboratory Medicine: Principles and Practice. Academic Pathology. January 2021. doi:10.1177/2374289521990784
- Abels E, Pantanowitz L, Aeffner F, Zarella MD, van der Laak J, Bui MM, Vemuri VN, Parwani AV, Gibbs J, Agosto-Arroyo E, Beck AH, Kozlowski C. Computational pathology definitions, best practices, and recommendations for regulatory guidance: a white paper from the Digital Pathology Association. J Pathol. 2019 Nov;249(3):286-294. doi: 10.1002/path.5331. Epub 2019 Sep 3. PMID: 31355445; PMCID: PMC6852275.
- Cheng JY, Abel JT, Balis UGJ, McClintock DS, Pantanowitz L. Challenges in the Development, Deployment, and Regulation of Artificial Intelligence in Anatomic Pathology. Am J Pathol. 2020 Nov 24:S0002-9440(20)30508-3. doi: 10.1016/j.ajpath.2020.10.018. Epub ahead of print. PMID: 33245914.
- Randy Goebel, Ajay Chander, Katharina Holzinger, Freddy Lecue, Zeynep Akata, et al. Explainable AI: the new 42? 2nd International Cross-Domain Conference for Machine Learning and Knowledge Extraction (CD-MAKE), Aug 2018, Hamburg, Germany. pp. 295-303. doi:10.1007/978-3-319-99740-7_21. hal-01934928
- James H. Harrison, Jr, MD, PhD; John R. Gilbertson, MD; Matthew G. Hanna, MD; Niels H. Olson, MD; Jansen N. Seheult, MB, BCh, BAO, MSc, MD; James M. Sorace, MD, MS; Michelle N. Stram, MD, MSc. Introduction to Artificial Intelligence and Machine Learning for Pathology. Arch Pathol Lab Med (2021). doi.org/10.5858/arpa.2020-0541-CP