Sqoop Installation and Configuration Tutorial

Download and Extract

1) Download URL: http://mirrors.hust.edu.cn/apache/sqoop/1.4.6/

2) Upload the installation package sqoop-1.4.6.bin__hadoop-2.0.4-alpha.tar.gz to the virtual machine.

3) Extract the Sqoop package to the target directory, for example:

$ tar -zxf sqoop-1.4.6.bin__hadoop-2.0.4-alpha.tar.gz -C /opt/module/
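As an optional sanity check (assuming the /opt/module path used above), list the target directory and confirm the extracted Sqoop folder is there:

$ ls /opt/module/ | grep sqoop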

Modify the Configuration Files

Like most big-data frameworks, Sqoop keeps its configuration files in the conf directory under the Sqoop root directory.

1) Rename the configuration template

$ mv sqoop-env-template.sh sqoop-env.sh

2) Edit the configuration file

sqoop-env.sh

export HADOOP_COMMON_HOME=/opt/module/hadoop-2.7.2

export HADOOP_MAPRED_HOME=/opt/module/hadoop-2.7.2

export HIVE_HOME=/opt/module/hive

export ZOOKEEPER_HOME=/opt/module/zookeeper-3.4.10

export ZOOCFGDIR=/opt/module/zookeeper-3.4.10

export HBASE_HOME=/opt/module/hbase
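Putting both steps together, the full sequence might look like the sketch below (assuming the installation path from the extraction step; only set the variables for the components you actually have installed, and any editor can stand in for vim):

$ cd /opt/module/sqoop-1.4.6.bin__hadoop-2.0.4-alpha/conf
$ mv sqoop-env-template.sh sqoop-env.sh
$ vim sqoop-env.sh    # append the export lines shown above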

Copy the JDBC Driver

Copy the MySQL JDBC driver into Sqoop's lib directory, for example:

$ cp mysql-connector-java-5.1.27-bin.jar /opt/module/sqoop-1.4.6.bin__hadoop-2.0.4-alpha/lib/
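You can optionally confirm that the driver is now in place; the command below should print the jar file name copied above:

$ ls /opt/module/sqoop-1.4.6.bin__hadoop-2.0.4-alpha/lib/ | grep mysql-connector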

Verify Sqoop

We can verify that Sqoop is configured correctly by running one of its commands:

$ bin/sqoop help

Some warnings appear (the warning messages are omitted here), followed by the output of the help command:

Available commands:
  codegen            Generate code to interact with database records
  create-hive-table  Import a table definition into Hive
  eval               Evaluate a SQL statement and display the results
  export             Export an HDFS directory to a database table
  help               List available commands
  import             Import a table from a database to HDFS
  import-all-tables  Import tables from a database to HDFS
  import-mainframe   Import datasets from a mainframe server to HDFS
  job                Work with saved jobs
  list-databases     List available databases on a server
  list-tables        List available tables in a database
  merge              Merge results of incremental imports
  metastore          Run a standalone Sqoop metastore
  version            Display version information
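As an additional check, the version command listed above (run from the Sqoop root directory, like the help command) should print the installed version, Sqoop 1.4.6 in this case:

$ bin/sqoop version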

Test Whether Sqoop Can Connect to the Database

$ bin/sqoop list-databases --connect jdbc:mysql://hadoop102:3306/ --username root --password 000000

The following output appears:

information_schema
metastore
mysql
oozie
performance_schema
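With the connection verified, a natural next step is to list the tables of one of these databases, for example the mysql system database, reusing the same hostname and credentials from the example above:

$ bin/sqoop list-tables --connect jdbc:mysql://hadoop102:3306/mysql --username root --password 000000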
