Oral cancer is one of the deadliest diseases worldwide, and its varied morphological traits make accurate manual classification difficult. Moreover, the traditional diagnostic methods used by clinicians are time-consuming and prone to error. Computer-assisted histopathological image classification is therefore of great value for the detection of oral cancer. We propose an image classification model, the External Attention Transformer, built on the external attention mechanism and designed to extract fine-grained discriminative features from oral cancer tissue sections and their normal counterparts. We used 4946 oral histopathological images divided into two categories: normal (2435 images) and oral squamous cell carcinoma (OSCC, 2511 images). The external attention based deep neural network attained a classification accuracy of 96.97%, with sensitivity and specificity of 97.61% and 96.41%, respectively. Compared with state-of-the-art methods, this represents a significant improvement in the effectiveness of artificial intelligence methods for classifying oral cancer, and it holds potential for early oral cancer detection.
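The abstract names the external attention mechanism as the model's core operation. As a rough illustration only (not the paper's implementation), external attention replaces self-attention with two small learnable external memory units shared across the dataset; a minimal NumPy sketch follows, in which the token count, feature dimension, and memory size are illustrative assumptions:

```python
import numpy as np

def external_attention(x, m_k, m_v):
    """Sketch of external attention: tokens attend over two small
    external memory units instead of over each other.
    x:   (n, d) token features
    m_k: (s, d) external key memory (learnable in a real model)
    m_v: (s, d) external value memory (learnable in a real model)
    """
    attn = x @ m_k.T                                         # (n, s) slot similarities
    attn = np.exp(attn - attn.max(axis=0, keepdims=True))
    attn = attn / attn.sum(axis=0, keepdims=True)            # softmax over tokens
    attn = attn / (attn.sum(axis=1, keepdims=True) + 1e-9)   # double-norm: l1 over slots
    return attn @ m_v                                        # (n, d) re-expressed features

# Illustrative shapes: 14x14 image patches, 64-dim features, 8 memory slots.
rng = np.random.default_rng(0)
x = rng.standard_normal((196, 64))
m_k = rng.standard_normal((8, 64))
m_v = rng.standard_normal((8, 64))
out = external_attention(x, m_k, m_v)
print(out.shape)  # (196, 64)
```

Because the memories are shared across all samples rather than derived from each input, the operation is linear in the number of tokens, which is one motivation for transformer variants built on it.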