Work place: VFSTR Deemed to be University, Guntur, Andhra Pradesh, India
E-mail: bhuvanachavala@gmail.com
Research Interests: Image Processing, Network Security, Network Architecture, Image Manipulation, Image Compression, Neural Networks, Computational Learning Theory, Data Structures and Algorithms
Biography
Ms. Chavala Bhuvaneswari is an MCA student in the Department of Information Technology at VFSTR Deemed to be University. Her professional activities have focused on convolutional neural networks and deep learning for image processing, image classification, and identification.
By B. Premamayudu, Chavala Bhuvaneswari
DOI: https://doi.org/10.5815/ijigsp.2022.05.07, Pub. Date: 8 Oct. 2022
Identification of COVID-19 may help the community and patients to contain the disease and plan treatment at the right time. Deep neural network models are widely used to analyze medical images of COVID-19 for automatic detection and to give decision support to radiologists in summarizing accurate findings. This paper proposes deep transfer learning on chest CT scan images for the detection and diagnosis of COVID-19. VGG19, InceptionResNetV3, InceptionV3, and DenseNet201 neural networks were used for automatic detection of COVID-19 from CT scan images (SARS-CoV-2 CT scan dataset). Four deep transfer learning models were developed, tested, and compared. The main objective of this paper is to use pre-trained features and combine them with target-domain features to improve classification accuracy. DenseNet201 achieved the best performance, with a classification accuracy of 99.98% over 300 epochs. The findings of the experiments show that deeper networks struggle to train adequately and are less consistent when data is limited. The DenseNet201 model adopted for COVID-19 identification from lung CT scans was intensively optimized with well-chosen hyperparameters and performs at a noteworthy level, with precision of 99.2%, recall of 100%, specificity of 99.2%, and an F1 score of 99.2%.
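The abstract describes a transfer-learning setup in which ImageNet pre-trained features are reused and combined with task-specific layers for CT classification. The Python/Keras sketch below illustrates that general idea with a frozen DenseNet201 backbone and a small binary classification head. It is not the authors' code: the dataset paths, image size, head architecture, learning rate, and epoch count are all illustrative assumptions.

```python
# Illustrative sketch only (not the paper's released code): transfer learning
# with a pre-trained DenseNet201 backbone for binary COVID / non-COVID
# classification of chest CT images. Paths and hyperparameters are assumptions.
import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications import DenseNet201
from tensorflow.keras.applications.densenet import preprocess_input

IMG_SIZE = (224, 224)  # assumed input resolution
BATCH = 32

# ImageNet-pre-trained backbone; the top classifier is dropped so the
# pre-trained features can be combined with task-specific layers.
base = DenseNet201(weights="imagenet", include_top=False,
                   input_shape=IMG_SIZE + (3,))
base.trainable = False  # freeze pre-trained features initially

model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dense(256, activation="relu"),
    layers.Dropout(0.5),
    layers.Dense(1, activation="sigmoid"),  # COVID vs. non-COVID
])

model.compile(optimizer=tf.keras.optimizers.Adam(1e-4),
              loss="binary_crossentropy",
              metrics=["accuracy",
                       tf.keras.metrics.Precision(name="precision"),
                       tf.keras.metrics.Recall(name="recall")])

# "ct_data/train" and "ct_data/val" are placeholder paths standing in for a
# local copy of the SARS-CoV-2 CT scan dataset, one subfolder per class.
train_ds = tf.keras.utils.image_dataset_from_directory(
    "ct_data/train", image_size=IMG_SIZE, batch_size=BATCH)
val_ds = tf.keras.utils.image_dataset_from_directory(
    "ct_data/val", image_size=IMG_SIZE, batch_size=BATCH)

# DenseNet expects its own ImageNet input normalization.
train_ds = train_ds.map(lambda x, y: (preprocess_input(x), y))
val_ds = val_ds.map(lambda x, y: (preprocess_input(x), y))

model.fit(train_ds, validation_data=val_ds, epochs=10)
```

In this kind of setup, once the new head converges, the backbone is typically unfrozen and trained further at a lower learning rate as a second fine-tuning stage.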