Open Access
Sqoop usage in Hadoop Distributed File System and Observations to Handle Common Errors
Author(s) -
K. Uma Pavan Kumar,
S V N Srinivasu,
M N. Nachappa
Publication year - 2020
Publication title -
International Journal of Recent Technology and Engineering
Language(s) - English
Resource type - Journals
ISSN - 2277-3878
DOI - 10.35940/ijrte.d4980.119420
Subject(s) - computer science, distributed file system, process (computing), installation, work (physics), database, file system, operating system, relational database, engineering, mechanical engineering
The Hadoop framework provides a way of storing and processing huge amounts of data. Social media and e-commerce companies such as Facebook, Twitter, and Amazon use Hadoop ecosystem tools to store data in the Hadoop Distributed File System (HDFS) and to process it with MapReduce (MR). The current work describes the usage of Sqoop in the process of importing to and exporting from HDFS. The work covers the various import/export commands supported by the Sqoop tool in the Hadoop ecosystem. The importance of the work lies in highlighting the common errors encountered while installing and working with Sqoop. Many developers and researchers use Sqoop to perform the import/export process and to handle source data in relational format. The current work presents the connectivity between MySQL and Sqoop, along with the usage of various commands and their results. As the outcome of the work, for each command the possible errors encountered and the corresponding solutions are given. The common configuration settings to follow so that Sqoop runs without errors are also described.
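As a minimal sketch of the kind of MySQL-to-HDFS import the abstract refers to, a typical Sqoop import invocation is shown below; the database name, table, credentials, and target directory are hypothetical placeholders rather than values taken from the paper.

    # Import a hypothetical MySQL table into HDFS with a single mapper
    sqoop import \
      --connect jdbc:mysql://localhost:3306/employees_db \
      --username dbuser \
      --password-file /user/hadoop/.db.password \
      --table employees \
      --target-dir /user/hadoop/employees_import \
      --num-mappers 1

A frequent source of connection errors in such a setup is a missing MySQL JDBC driver: the mysql-connector-java jar generally needs to be present in Sqoop's lib directory before the --connect option can reach the database.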
