Enhancing the Traditional File System to HDFS: A Big Data Solution
Author(s) -
Himani Saraswat,
Neeta Sharma,
Abhishek Rai
Publication year - 2017
Publication title -
International Journal of Computer Applications
Language(s) - English
Resource type - Journals
ISSN - 0975-8887
DOI - 10.5120/ijca2017914367
Subject(s) - computer science, distributed file system, big data, file system, operating system, database, world wide web
We are in the twenty-first century, also known as the digital era, in which nearly everything generates data, whether a mobile phone, a sensor signal, a day-to-day purchase and much more. Because of this rapid increase in the amount of data, big data has become a current and future frontier for researchers. In big data analysis, computation is performed on massive data sets to extract intelligent, knowledgeable and meaningful information, and at the same time storage must be readily available to support the concurrent computation process. Hadoop is designed to meet this complex but meaningful workload. HDFS (the Hadoop Distributed File System) is highly fault-tolerant and is designed to be deployed on low-cost hardware. This paper presents the benefits HDFS offers for large data sets, the HDFS architecture and its role in Hadoop.
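As a rough illustration of the storage model the abstract alludes to, the following toy Python sketch (not Hadoop code; the node names and file size are invented for the example) mimics how HDFS splits a file into fixed-size blocks and replicates each block across several data nodes, which is what makes the system fault-tolerant on low-cost hardware. The default block size (128 MB) and replication factor (3) are the usual Hadoop 2.x defaults and are configurable in a real cluster.

```python
# Toy illustration (NOT Hadoop code) of HDFS-style block splitting
# and replica placement across data nodes.
BLOCK_SIZE = 128 * 1024 * 1024   # common HDFS default block size (128 MB)
REPLICATION = 3                  # common HDFS default replication factor


def split_into_blocks(file_size, block_size=BLOCK_SIZE):
    """Return the sizes (in bytes) of the blocks a file would occupy."""
    full, rem = divmod(file_size, block_size)
    return [block_size] * full + ([rem] if rem else [])


def place_replicas(blocks, datanodes, replication=REPLICATION):
    """Round-robin placement of each block's replicas on distinct nodes."""
    placement = {}
    for i, _ in enumerate(blocks):
        placement[i] = [datanodes[(i + r) % len(datanodes)]
                        for r in range(replication)]
    return placement


# A hypothetical 300 MB file on a hypothetical 4-node cluster:
blocks = split_into_blocks(300 * 1024 * 1024)
print(len(blocks))  # 3 blocks: 128 MB + 128 MB + 44 MB
print(place_replicas(blocks, ["dn1", "dn2", "dn3", "dn4"]))
```

If any one data node fails, every block it held still has two live replicas elsewhere, which is the core idea behind HDFS's fault tolerance.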
Accelerating Research
Address
John Eccles House, Robert Robinson Avenue,
Oxford Science Park, Oxford
OX4 4GP, United Kingdom