Abstract
Computational histopathology algorithms can interpret very large volumes of data, guiding pathologists to assess slides promptly and aiding in the localization and quantification of abnormal cells or tissues. In recent years, deep learning has replaced conventional image processing methods as the mainstream methodology for interpreting cancer pathology images. However, as with conventional computer vision methods, stain normalization remains essential for diagnostic accuracy when identifying tissue with convolutional neural networks (CNNs). Traditional prior-knowledge-oriented color matching, as well as purely learning-based approaches that map to a single target style with generative adversarial networks, may suffer a decrease in accuracy when data come from many centers. In this paper, we propose a novel color normalization method based on a conditional generative adversarial network (cGAN). It is a learning-based interpolation approach that operates in a probability distribution space and is trained on multiple datasets. The target template is designed to be label-dependent to overcome the improper color mapping caused by data heterogeneity. Tests are performed on histopathology datasets from The Cancer Genome Atlas (TCGA), and the proposed method outperforms previous works in classification accuracy. This approach has potential in clinical practice for better recognition of cancer in digital pathology and can be implemented in a decentralized setting.
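As a rough illustration of the idea described in the abstract (not the authors' implementation), the sketch below shows a conditional GAN in PyTorch whose generator recolors a source patch while conditioned on a tissue-class label, with an L1 term pulling the output toward a label-dependent color template. The module sizes, the `NUM_CLASSES` and `PATCH` constants, and the per-class template tensor are all illustrative assumptions.

```python
# Minimal sketch, assuming a patch-level cGAN for stain normalization:
# the generator maps a source-stained patch to a normalized appearance,
# conditioned on a tissue-class label that selects a label-dependent
# color template. Everything here is an assumption for illustration.
import torch
import torch.nn as nn

NUM_CLASSES = 2   # assumed number of tissue labels (e.g. tumor / normal)
PATCH = 64        # assumed patch size

class Generator(nn.Module):
    """Generator conditioned on the tissue label via a broadcast channel."""
    def __init__(self, num_classes=NUM_CLASSES):
        super().__init__()
        self.label_emb = nn.Embedding(num_classes, PATCH * PATCH)
        self.net = nn.Sequential(
            nn.Conv2d(3 + 1, 32, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 3, 3, padding=1), nn.Tanh(),  # normalized RGB patch
        )

    def forward(self, x, label):
        cond = self.label_emb(label).view(x.size(0), 1, PATCH, PATCH)
        return self.net(torch.cat([x, cond], dim=1))

class Discriminator(nn.Module):
    """Discriminator that also sees the label condition."""
    def __init__(self, num_classes=NUM_CLASSES):
        super().__init__()
        self.label_emb = nn.Embedding(num_classes, PATCH * PATCH)
        self.net = nn.Sequential(
            nn.Conv2d(3 + 1, 32, 4, stride=2, padding=1), nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2, inplace=True),
            nn.Flatten(), nn.Linear(64 * (PATCH // 4) ** 2, 1),
        )

    def forward(self, x, label):
        cond = self.label_emb(label).view(x.size(0), 1, PATCH, PATCH)
        return self.net(torch.cat([x, cond], dim=1))

# One illustrative adversarial update with an L1 term toward the
# label-dependent color template (a synthetic reference per class).
if __name__ == "__main__":
    G, D = Generator(), Discriminator()
    opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
    opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
    bce = nn.BCEWithLogitsLoss()

    src = torch.rand(4, 3, PATCH, PATCH) * 2 - 1             # source-stain patches
    labels = torch.randint(0, NUM_CLASSES, (4,))             # tissue-class labels
    templates = torch.rand(NUM_CLASSES, 3, PATCH, PATCH) * 2 - 1  # per-class color targets

    # Discriminator step: real = template appearance, fake = generator output.
    fake = G(src, labels).detach()
    real = templates[labels]
    loss_d = bce(D(real, labels), torch.ones(4, 1)) + bce(D(fake, labels), torch.zeros(4, 1))
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()

    # Generator step: fool the discriminator while staying close to the template colors.
    fake = G(src, labels)
    loss_g = bce(D(fake, labels), torch.ones(4, 1)) + 10.0 * nn.functional.l1_loss(fake, templates[labels])
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()
    print(f"D loss {loss_d.item():.3f}, G loss {loss_g.item():.3f}")
```

In the paper's setting the condition comes from the data label rather than a fixed global template, which is what lets the mapping adapt to heterogeneous source distributions; the L1 weight and template construction above are placeholders, not values from the source.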
| Original language | English |
|---|---|
| Title of host publication | Proceedings of Medical Imaging 2021: Digital Pathology, 15-19 February 2021, Online Only, United States |
| Publisher | SPIE |
| Number of pages | 7 |
| ISBN (Print) | 9781510640351 |
| DOIs | |
| Publication status | Published - 2021 |
| Event | Medical Imaging (Conference : SPIE) - Duration: 15 Feb 2021 → … |
Publication series
| Name | |
|---|---|
| ISSN (Print) | 1605-7422 |
Conference
| Conference | Medical Imaging (Conference : SPIE) |
|---|---|
| Period | 15/02/21 → … |
Bibliographical note
Publisher Copyright: © COPYRIGHT SPIE. Downloading of the abstract is permitted for personal use only.