Quintet margin loss for an improved knowledge distillation in histopathology image analysis
6 April 2023
Abstract
Digital and computational pathology tools often suffer from a lack of relevant data. Although more and more data centers publish datasets, high-quality ground truth annotations may not be available in a timely manner. Herein, we propose a knowledge distillation framework that utilizes a teacher network already trained on a relatively large amount of data to train a student network that achieves accurate and robust performance on histopathology images. For effective and efficient knowledge distillation, we introduce a quintet margin loss that pushes the student network not only to mimic the knowledge representation of the teacher network but also to outperform the teacher network on a target domain. We systematically evaluated the proposed approach. The results show that it outperforms competing models, both with and without various types of knowledge distillation.
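The core idea, distilling from a teacher while using a margin to push the student beyond merely matching it, can be illustrated with a generic hinge-style margin loss. The sketch below is a simplified, hypothetical illustration of that mechanism, not the paper's actual quintet formulation; the function names, distance choice, and `margin` value are all assumptions for exposition.

```python
import numpy as np

def euclidean(a, b):
    """Euclidean distance between two feature vectors."""
    return float(np.linalg.norm(np.asarray(a, dtype=float) - np.asarray(b, dtype=float)))

def margin_distillation_loss(student_feat, teacher_feat, target_feat, margin=0.2):
    """Illustrative hinge-style margin loss for knowledge distillation.

    Penalizes the student whenever its embedding is not closer to the
    target representation than the teacher's embedding is, by at least
    `margin`. This is a generic sketch of margin-based distillation,
    not the quintet margin loss proposed in the paper.
    """
    d_student = euclidean(student_feat, target_feat)
    d_teacher = euclidean(teacher_feat, target_feat)
    # Hinge: zero loss once the student beats the teacher by the margin.
    return max(0.0, d_student - d_teacher + margin)
```

The hinge term goes to zero once the student's distance to the target undercuts the teacher's by the margin, which is one simple way to encode "outperform the teacher on the target domain" rather than only imitating it.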
Conference Presentation
© (2023) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Trinh T. L. Vuong and Jin Tae Kwak "Quintet margin loss for an improved knowledge distillation in histopathology image analysis", Proc. SPIE 12471, Medical Imaging 2023: Digital and Computational Pathology, 124710J (6 April 2023); https://doi.org/10.1117/12.2654275
KEYWORDS
Education and training
Histopathology
Data modeling
Image analysis
Machine learning
Adversarial training
Convolution