Workplace: Vice President, Citigroup NA, Tampa, FL, USA
E-mail: Shamsher1680@gmail.com
Website: https://orcid.org/0009-0007-6784-9710
Research Interests:
Biography
Shamsher Ali received his Master of Computer Applications (MCA) degree from Aligarh Muslim University, Aligarh, India, in 2001. He has worked for various financial and information technology organizations, including IBM, Merrill Lynch (now Bank of America), Larsen & Toubro Infotech, and Citigroup NA. He is currently a Vice President at Citigroup NA in Tampa, FL. His work and interests center on databases and their optimization and scalability.
By Ranga Kavitha, Mahaboob Sharief Shaik, Narala Swarnalatha, M. Pujitha, Syed Asadullah Hussaini, Samiullah Khan, Shamsher Ali
DOI: https://doi.org/10.5815/ijieeb.2025.02.07, Pub. Date: 8 Apr. 2025
Effective storage management is crucial to the speed and cost of cloud computing systems, given the exponential growth of data. Detecting and removing duplicate data improves storage utilization and system efficiency: consuming less storage capacity reduces data transmission costs and enhances the scalability of cloud infrastructure. Applying deduplication at scale, however, presents several significant obstacles, including security concerns, deduplication latency, and the need to maintain data integrity. This paper introduces the Data Deduplication-based Efficient Cloud Optimisation Technique (DD-ECOT), which aims to optimize storage processes and enhance performance in cloud-based systems. DD-ECOT combines advanced pattern recognition with chunking to increase storage efficiency at minimal cost, protects data during deduplication through secure hash-based indexing, and uses parallel processing and a scalable design to reduce latency without degrading system performance, making it adaptable to large, dynamic cloud environments. Use cases for DD-ECOT include enterprise cloud storage systems, disaster recovery solutions, and large-scale data management environments. Simulation analysis shows that the proposed solution outperforms conventional deduplication techniques in storage efficiency, data retrieval speed, and overall system performance, suggesting that DD-ECOT can improve cloud service delivery while reducing operational costs.
In the simulations, DD-ECOT achieves 92.8% storage efficiency by reducing duplicate data, lowers latency by 97.2% through parallel processing and sophisticated deduplication, raises data integrity to 98.1% via secure hash-based indexing, and reaches 95.7% bandwidth utilization for efficient data transfer. These improvements suggest that DD-ECOT can cut operational costs, optimize storage, and outperform current deduplication methods.
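The paper's DD-ECOT internals are not reproduced here, but the core idea the abstract describes (chunking the data and keeping a hash-based index so each unique chunk is stored only once) can be illustrated with a minimal sketch. Fixed-size chunking and SHA-256 digests are assumptions for illustration; the actual scheme uses advanced pattern recognition and secure hash-based indexing.

```python
import hashlib

def deduplicate(data: bytes, chunk_size: int = 4096):
    """Split data into fixed-size chunks and store each unique chunk once,
    keyed by its SHA-256 digest (a stand-in for a secure hash index)."""
    store = {}   # digest -> chunk bytes (unique chunks only)
    recipe = []  # ordered digests needed to rebuild the original stream
    for i in range(0, len(data), chunk_size):
        chunk = data[i:i + chunk_size]
        digest = hashlib.sha256(chunk).hexdigest()
        store.setdefault(digest, chunk)  # skip chunks already indexed
        recipe.append(digest)
    return store, recipe

def rebuild(store: dict, recipe: list) -> bytes:
    """Reassemble the original byte stream from the chunk store."""
    return b"".join(store[d] for d in recipe)

# Highly redundant input: two identical 4 KiB chunks.
data = b"ABCD" * 2048
store, recipe = deduplicate(data, chunk_size=4096)
print(len(recipe), len(store))  # 2 chunks referenced, 1 actually stored
assert rebuild(store, recipe) == data
```

Storage savings come from `store` holding one copy per unique digest while `recipe` preserves ordering; verifying digests on rebuild is one way such an index also supports the integrity checks the abstract mentions.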