To reduce the dependence of lesion detection algorithms on annotated data and to fully exploit large-scale medical image data from healthy subjects, this paper proposes an unsupervised breast lesion detection model based on a parallel convolutional neural network, treating lesion detection in mammograms as a one-class classification problem. The network comprises two parallel convolutional branches. One branch is trained on normal mammographic images to extract the deep features of normal images, using a compactness measure of the features as its training loss. To prevent the features learned by this branch from lacking inter-class discrimination, the other branch is trained on a subset of the ImageNet dataset, using a descriptiveness measure of the features as its training loss. Together, the two branches ensure that the extracted features exhibit both intra-class similarity and inter-class discrimination. ResNet50 is selected by comparison as the backbone of the parallel branches and is further improved. Experiments on the INbreast and BCDR datasets show that the proposed unsupervised lesion detection method achieves performance comparable to supervised methods without requiring any lesion annotation data.
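The compactness loss described above can be illustrated with a minimal NumPy sketch. This follows the standard one-class feature-learning formulation, where each sample's deep feature is penalized by its distance to the mean of the other features in the batch; the exact loss used in the paper may differ, and the function name and batch shapes here are illustrative assumptions.

```python
import numpy as np

def compactness_loss(features):
    """Compactness measure over a batch of deep features.

    features: (n, k) array, one k-dimensional feature per normal image.
    For each sample, compute its squared distance to the mean of the
    OTHER n-1 samples in the batch, then average over all entries.
    A batch of identical features yields zero loss.
    """
    n, k = features.shape
    total = features.sum(axis=0)
    # Leave-one-out mean for each row: (sum of others) / (n - 1)
    means_others = (total - features) / (n - 1)
    diffs = features - means_others
    return float((diffs ** 2).sum() / (n * k))
```

In a two-branch setup such as the one described, this term would be combined with a descriptiveness loss (e.g., cross-entropy on the ImageNet-subset branch) as a weighted sum, so that features are simultaneously compact for normal mammograms and discriminative across external classes.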