
Error when using Sqoop to migrate data between Hive and a MySQL database


Run: ./sqoop create-hive-table --connect jdbc:mysql://192.168.1.10:3306/ekp_11 --table job_log --username root --password 123456 --hive-table job_log

The intent was to copy the relational table's schema into Hive, but the command produced the following pile of errors:
Warning: /usr/lib/hbase does not exist! HBase imports will fail.
Please set $HBASE_HOME to the root of your HBase installation.
Warning: /usr/lib/hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
15/08/02 02:04:14 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
15/08/02 02:04:14 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
15/08/02 02:04:14 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
15/08/02 02:04:14 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
15/08/02 02:04:14 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `job_log` AS t LIMIT 1
15/08/02 02:04:14 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `job_log` AS t LIMIT 1
15/08/02 02:04:14 WARN hive.TableDefWriter: Column fd_start_time had to be cast to a less precise type in Hive
15/08/02 02:04:14 WARN hive.TableDefWriter: Column fd_end_time had to be cast to a less precise type in Hive
Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library /cloud/Hadoop-2.2.0/lib/native/libhadoop.so which might have disabled stack guard. The VM will try to fix the stack guard now.
It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'.
15/08/02 02:04:16 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
15/08/02 02:04:17 INFO hive.HiveImport: Loading uploaded data into Hive
15/08/02 02:04:17 ERROR tool.CreateHiveTableTool: Encountered IOException running create table job: java.io.IOException: Cannot run program "hive": error=2, No such file or directory
at java.lang.ProcessBuilder.start(ProcessBuilder.java:1047)
at java.lang.Runtime.exec(Runtime.java:617)
at java.lang.Runtime.exec(Runtime.java:528)
at org.apache.sqoop.util.Executor.exec(Executor.java:76)
at org.apache.sqoop.hive.HiveImport.executeExternalHiveScript(HiveImport.java:382)
at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:335)
at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:239)
at org.apache.sqoop.tool.CreateHiveTableTool.run(CreateHiveTableTool.java:58)
at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
at org.apache.sqoop.Sqoop.main(Sqoop.java:238)
Caused by: java.io.IOException: error=2, No such file or directory
at java.lang.UNIXProcess.forkAndExec(Native Method)
at java.lang.UNIXProcess.<init>(UNIXProcess.java:186)
at java.lang.ProcessImpl.start(ProcessImpl.java:130)
at java.lang.ProcessBuilder.start(ProcessBuilder.java:1028)

... 13 more
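The root cause is the last exception in the trace: Sqoop shells out to the hive command and cannot find it (error=2, No such file or directory). Before changing any configuration, a quick sanity check from the machine running Sqoop makes the cause obvious. This is only a sketch, assuming a standard Linux shell and the Hive install path that this article configures later:

which hive                                  # if this prints nothing, hive is not on the PATH, matching the error above
echo "$HIVE_HOME"                           # if this is also empty, Sqoop has no other way to locate the Hive launcher
ls /cloud/apache-hive-1.2.1-bin/bin/hive    # yet the launcher itself exists under the Hive install directory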

Force of habit: I had assumed Sqoop would be smart enough to find the local Hive installation on its own.

Solution: point Sqoop at the Hive environment you are actually using.

The steps are as follows:
1. In /sqoop-1.4.4/conf, find the file sqoop-env-template.sh and rename it to sqoop-env.sh;
2. Edit sqoop-env.sh and fill in your Hive installation directory; that is all it takes (a full sketch follows the example below).

For example: export HIVE_HOME=/cloud/apache-hive-1.2.1-bin
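Putting it together, the fix looks like this. This is a minimal sketch using the Sqoop and Hive paths mentioned in this article; substitute your own install directories:

cd /sqoop-1.4.4/conf
mv sqoop-env-template.sh sqoop-env.sh        # or cp, if you prefer to keep the original template
echo 'export HIVE_HOME=/cloud/apache-hive-1.2.1-bin' >> sqoop-env.sh

After that, re-running the ./sqoop create-hive-table command from the top of the article should be able to launch Hive and create the table. As the earlier warning in the log suggests, it is also worth replacing --password 123456 with -P so the password is prompted for interactively rather than exposed on the command line.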

