Workplace: Marri Laxman Reddy Institute of Technology and Management / Computer Science and Engineering Department, Medchal Malkajgiri, 500043, India
E-mail: swarnalatha.jogireddy@gmail.com
ORCID: https://orcid.org/0009-0006-7753-7828
Research Interests:
Biography
Narala Swarnalatha was awarded an M.Tech in Computer Science and Engineering from JNTUA. She has 10 years of teaching experience and is currently working as an Assistant Professor in Computer Science and Engineering at Marri Laxman Reddy Institute of Technology and Management, Hyderabad. Her areas of interest are Artificial Intelligence and Machine Learning.
By Ranga Kavitha, Mahaboob Sharief Shaik, Narala Swarnalatha, M. Pujitha, Syed Asadullah Hussaini, Samiullah Khan, Shamsher Ali
DOI: https://doi.org/10.5815/ijieeb.2025.02.07, Pub. Date: 8 Apr. 2025
Effective storage management is crucial to the speed and cost of cloud computing systems, and its importance grows as data volumes continue to increase at an alarming pace. Detecting and removing duplicate data improves storage utilisation and system efficiency: using less storage capacity reduces data transmission costs and enhances the scalability of cloud infrastructure. Applying deduplication at scale, however, presents several significant obstacles, including security issues, deduplication latency, and maintaining data integrity. This paper introduces a novel method, the Data Deduplication-based Efficient Cloud Optimisation Technique (DD-ECOT), intended to optimise storage processes and enhance performance in cloud-based systems. DD-ECOT combines advanced pattern recognition with chunking to increase storage efficiency at minimal cost, protects data during deduplication through secure hash-based indexing, and uses parallel processing and a scalable design to reduce latency, making it suitable for large, dynamic cloud environments. Enterprise cloud storage systems, disaster recovery solutions, and large-scale data management environments are among the use cases for DD-ECOT. Simulation analysis shows that the proposed framework outperforms conventional deduplication techniques in storage efficiency, data retrieval speed, and overall system performance: DD-ECOT improves storage efficiency by 92.8% by reducing duplicate data, reduces latency by 97.2% through parallel processing and refined deduplication, raises data integrity to 98.1% with secure hash-based indexing, and achieves 95.7% optimised bandwidth usage for efficient data transfer. These results indicate that DD-ECOT can lower operational costs, optimise storage, and improve cloud service delivery compared with existing deduplication methods.
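To illustrate the general idea behind hash-indexed deduplication described in the abstract, the following is a minimal sketch in Python. It assumes fixed-size chunking and SHA-256 digests; the function names, chunk size, and data structures are illustrative assumptions and are not taken from the DD-ECOT paper.

# Minimal sketch of hash-indexed chunk deduplication (assumed fixed-size
# chunking with SHA-256 digests; names and parameters are illustrative only,
# not from the DD-ECOT paper).
import hashlib

CHUNK_SIZE = 4096  # assumed fixed chunk size in bytes

def deduplicate(data: bytes, chunk_size: int = CHUNK_SIZE):
    """Split data into chunks, store each unique chunk once, and return
    (chunk_store, manifest), where the manifest lists digests in order."""
    chunk_store = {}   # digest -> chunk bytes (unique chunks only)
    manifest = []      # ordered digests needed to reconstruct the data
    for offset in range(0, len(data), chunk_size):
        chunk = data[offset:offset + chunk_size]
        digest = hashlib.sha256(chunk).hexdigest()
        if digest not in chunk_store:      # new chunk: store it once
            chunk_store[digest] = chunk
        manifest.append(digest)            # duplicates only add an index entry
    return chunk_store, manifest

def reconstruct(chunk_store, manifest) -> bytes:
    """Rebuild the original data from the unique chunk store and manifest."""
    return b"".join(chunk_store[d] for d in manifest)

In such a scheme, duplicate chunks cost only an index entry rather than additional storage, which is the source of the storage and bandwidth savings the abstract reports; a production system would add collision handling, security hardening, and parallel chunk processing.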