IJIGSP Vol. 15, No. 1, 8 Feb. 2023
Breast Cancer Classification, Benign vs. Malignant Tumor, Breast Ultrasound, Deep Transfer Learning, VGG16 Model
Ultrasound-based breast screening has recently been gaining attention, especially for dense breasts. Technological advancement, growing cancer awareness, and benefits in cost, safety, and availability have led to a rapid rise in the breast ultrasound market. The irregular shape, intensity variation, and additional blood vessels of malignant tumors distinguish them from benign ones in ultrasound images. However, classifying breast cancer from ultrasound images is difficult owing to speckle noise and the complex texture of breast tissue. In this paper, a breast cancer classification method is presented using a transfer learning approach based on the VGG16 model. A median filter is used to despeckle the images. The convolutional and max-pooling layers of the pretrained VGG16 model serve as the feature extractor, and a proposed two-layer fully connected deep neural network is designed as the classifier. The Adam optimizer with a learning rate of 0.001 is used, and binary cross-entropy is chosen as the loss function for model optimization. Dropout is applied to the hidden layers to avoid overfitting. Breast ultrasound images from two databases (897 images in total) have been combined to train, validate, and test the performance and generalization strength of the classifier. Experimental results show a training accuracy of 98.2% and a testing accuracy of 91% on blind test data, with reduced computational complexity. The gradient-weighted class activation mapping (Grad-CAM) technique is used to visualize and verify the localization of the target regions at the final convolutional layer, and the localization is found to be noteworthy. The outcomes of this work may be useful for clinical applications in breast cancer diagnosis.
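The pipeline described in the abstract can be illustrated in code. The following is a minimal sketch assuming a Keras/TensorFlow implementation; the input size (224x224), hidden-layer widths, dropout rate, and median-filter kernel size are illustrative assumptions and are not specified by the authors.

# Minimal sketch of the described pipeline, assuming a Keras/TensorFlow implementation.
# Input size, hidden-layer widths, dropout rate, and median-filter kernel are assumptions.
import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications import VGG16
from scipy.ndimage import median_filter

def despeckle(image):
    # Median filtering to suppress speckle noise in the ultrasound image (3x3 kernel assumed).
    return median_filter(image, size=3)

def build_classifier(input_shape=(224, 224, 3)):
    # Convolutional and max-pooling layers of the pretrained VGG16 act as a frozen feature extractor.
    base = VGG16(weights="imagenet", include_top=False, input_shape=input_shape)
    base.trainable = False

    model = models.Sequential([
        base,
        layers.Flatten(),
        # Proposed fully connected classifier head; layer widths are illustrative.
        layers.Dense(256, activation="relu"),
        layers.Dropout(0.5),   # dropout on hidden layers to avoid overfitting
        layers.Dense(64, activation="relu"),
        layers.Dropout(0.5),
        layers.Dense(1, activation="sigmoid"),  # benign vs. malignant output
    ])

    # Adam optimizer (learning rate 0.001) with binary cross-entropy loss, as stated in the abstract.
    model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=1e-3),
                  loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_classifier()
model.summary()

In practice, the despeckled grayscale ultrasound images would be resized and replicated across three channels to match the VGG16 input, and the combined dataset split into training, validation, and blind test sets before fitting.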
A. B. M. Aowlad Hossain, Jannatul Kamrun Nisha, Fatematuj Johora, "Breast Cancer Classification from Ultrasound Images using VGG16 Model based Transfer Learning", International Journal of Image, Graphics and Signal Processing (IJIGSP), Vol. 15, No. 1, pp. 12-22, 2023. DOI: 10.5815/ijigsp.2023.01.02