Digital and computational pathology tools often suffer from a lack of relevant data. Although more and more datasets are being made publicly available, high-quality ground truth annotations may not be available in a timely manner. Herein, we propose a knowledge distillation framework in which a teacher network, already trained on a relatively large amount of data, transfers its knowledge to a student network to achieve accurate and robust performance on histopathology images. For effective and efficient knowledge distillation, we introduce a quintet margin loss that pushes the student network not only to mimic the knowledge representation of the teacher network but also to outperform the teacher network on a target domain. We systematically evaluated the proposed approach. The results show that it outperforms competing models both with and without other types of knowledge distillation methods.
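The abstract does not specify the form of the quintet margin loss, but the baseline it builds on is standard (Hinton-style) knowledge distillation, in which the student is trained to match the teacher's temperature-softened output distribution. The sketch below illustrates only that baseline; the function names, the temperature value, and the pure-Python formulation are illustrative assumptions, not the authors' implementation.

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of raw logits."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=4.0):
    """KL divergence between the softened teacher and student
    distributions (classic knowledge distillation). The paper's
    quintet margin loss additionally enforces a margin so the student
    can surpass the teacher on the target domain; that term is not
    reproduced here because its form is not given in the abstract."""
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    # Scale by T^2 to keep gradient magnitudes comparable across temperatures.
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q)) * temperature ** 2
```

When the student's logits match the teacher's, the loss is zero; it grows as the two distributions diverge.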
Trinh T. L. Vuong and Jin Tae Kwak
"Quintet margin loss for an improved knowledge distillation in histopathology image analysis", Proc. SPIE 12471, Medical Imaging 2023: Digital and Computational Pathology, 124710J (6 April 2023); https://doi.org/10.1117/12.2654275