Open Access
A Novel Technique to Remove Duplicacy from Cloud
Author(s) -
Neha Verma
Publication year - 2021
Publication title -
international journal for research in applied science and engineering technology
Language(s) - English
Resource type - Journals
ISSN - 2321-9653
DOI - 10.22214/ijraset.2021.35861
Subject(s) - cloud computing, upload, computer science, database, cloud storage, hash function, computer data storage, computer security, operating system
Cloud computing is one of the fastest-emerging technologies, helping many organizations save time, resources, and money while adding convenience for end users. This project deals with removing duplicate data on the cloud to save storage. The continuous development of information technology, and the requirement to make data available anytime and anywhere, make it necessary to remove duplicate files from the storage area to maximise usable capacity. When a user uploads a file, the system cross-checks it against existing data: if the data is new, it is kept in the user's bucket on the cloud; otherwise, the system discards the upload because the data is already stored. The proposed method is more secure and consumes fewer cloud resources. Its central idea is to reduce duplicate data as much as possible using a hashing technique.
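The abstract does not give the implementation details, but the described check-then-store flow can be sketched as follows. This is a minimal illustration, not the paper's actual system: the `Bucket` class, SHA-256 as the hash function, and the in-memory digest index are all assumptions made here for clarity.

```python
import hashlib

def file_hash(data: bytes) -> str:
    # Identical file contents always produce the same SHA-256 digest,
    # so the digest can serve as a duplicate-detection key.
    return hashlib.sha256(data).hexdigest()

class Bucket:
    """Toy stand-in for a user's cloud storage bucket (hypothetical)."""
    def __init__(self):
        self._objects = {}  # digest -> file bytes

    def upload(self, data: bytes) -> bool:
        """Store the file only if its hash is not already present.

        Returns True if the file was stored, False if it was a duplicate.
        """
        digest = file_hash(data)
        if digest in self._objects:
            return False  # duplicate: skip storing to save space
        self._objects[digest] = data
        return True

bucket = Bucket()
print(bucket.upload(b"report v1"))  # True  (new file, stored)
print(bucket.upload(b"report v1"))  # False (duplicate, discarded)
print(bucket.upload(b"report v2"))  # True  (different content, stored)
```

A real deployment would hash on the client or at the storage gateway and keep the digest index in a database rather than in memory, but the dedup decision itself reduces to this lookup.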
