
Misc notes: hadoop2.2.0 + hbase0.96.0 + hive0.13.1 integration


I won't go into the background of integrating Hadoop with Hive and HBase; this post simply walks through integrating HBase and Hive on a hadoop2.2.0 environment.

Because hive0.12 did not support hadoop2, you also had to swap in a number of Hadoop jars; starting with 0.13 this is much more convenient.


Environment: centos6.6-x64, jdk1.7, hadoop2.2.0, hbase0.96, hive0.13.1

1. Start Hadoop, install HBase, install MySQL

。。。 。。。
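These prerequisites are assumed rather than covered here. As a rough sketch of what "ready" looks like (assuming the standard Hadoop/HBase start scripts are on the PATH and MySQL is installed as the mysqld service):

  # start HDFS and YARN
  start-dfs.sh
  start-yarn.sh
  # start HBase
  start-hbase.sh
  # start MySQL, which will hold the Hive metastore
  service mysqld start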

2. Install and configure Hive

  1) Unpack the Hive tarball
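  A minimal sketch, assuming apache-hive-0.13.1-bin.tar.gz was downloaded into /home/hadoop (the install path referenced by hive-site.xml below):

  cd /home/hadoop
  tar -zxvf apache-hive-0.13.1-bin.tar.gz   # produces /home/hadoop/apache-hive-0.13.1-bin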

  2) Copy the following jars from HBase into the Hive lib directory (a copy sketch follows the list)

  

  hbase-client-0.96.0-hadoop2.jar

  hbase-common-0.96.0-hadoop2.jar

  hbase-common-0.96.0-hadoop2-tests.jar

  hbase-protocol-0.96.0-hadoop2.jar

  hbase-server-0.96.0-hadoop2.jar

  htrace-core-2.01.jar

  protobuf-java-2.5.0.jar

  guava-11.0.2.jar
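  A sketch of the copy, assuming HBase is installed at /home/hadoop/hbase-0.96.0-hadoop2 (adjust HBASE_HOME to your actual layout):

  HBASE_HOME=/home/hadoop/hbase-0.96.0-hadoop2        # assumed HBase install path
  HIVE_HOME=/home/hadoop/apache-hive-0.13.1-bin
  cp $HBASE_HOME/lib/hbase-client-0.96.0-hadoop2.jar \
     $HBASE_HOME/lib/hbase-common-0.96.0-hadoop2.jar \
     $HBASE_HOME/lib/hbase-common-0.96.0-hadoop2-tests.jar \
     $HBASE_HOME/lib/hbase-protocol-0.96.0-hadoop2.jar \
     $HBASE_HOME/lib/hbase-server-0.96.0-hadoop2.jar \
     $HBASE_HOME/lib/htrace-core-2.01.jar \
     $HBASE_HOME/lib/protobuf-java-2.5.0.jar \
     $HBASE_HOME/lib/guava-11.0.2.jar \
     $HIVE_HOME/lib/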

  3) Copy the MySQL JDBC driver jar into the Hive lib directory
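  For example (the driver file name below is hypothetical; use whatever Connector/J version you actually downloaded):

  cp mysql-connector-java-5.1.*-bin.jar /home/hadoop/apache-hive-0.13.1-bin/lib/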

  4) Configure conf/hive-site.xml

  

  

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<configuration>

<!-- Hive Execution Parameters -->

<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://hadoop:3306/hive?createDatabaseIfNotExist=true</value>
  <description>JDBC connect string for a JDBC metastore</description>
</property>

<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
  <description>Driver class name for a JDBC metastore</description>
</property>

<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>hive</value>
  <description>username to use against metastore database</description>
</property>

<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>hive</value>
  <description>password to use against metastore database</description>
</property>

<property>
    <name>hive.aux.jars.path</name>
    <value>file:///home/hadoop/apache-hive-0.13.1-bin/lib/hive-hbase-handler-0.13.1.jar,file:///home/hadoop/apache-hive-0.13.1-bin/lib/hbase-client-0.96.0-hadoop2.jar,file:///home/hadoop/apache-hive-0.13.1-bin/lib/hbase-common-0.96.0-hadoop2.jar,file:///home/hadoop/apache-hive-0.13.1-bin/lib/hbase-common-0.96.0-hadoop2-tests.jar,file:///home/hadoop/apache-hive-0.13.1-bin/lib/hbase-protocol-0.96.0-hadoop2.jar,file:///home/hadoop/apache-hive-0.13.1-bin/lib/hbase-server-0.96.0-hadoop2.jar,file:///home/hadoop/apache-hive-0.13.1-bin/lib/htrace-core-2.01.jar,file:///home/hadoop/apache-hive-0.13.1-bin/lib/zookeeper-3.4.5.jar,file:///home/hadoop/apache-hive-0.13.1-bin/lib/protobuf-java-2.5.0.jar,file:///home/hadoop/apache-hive-0.13.1-bin/lib/guava-11.0.2.jar</value>
</property>

</configuration>

  

(The value of hive.aux.jars.path must not contain any spaces or line breaks.)

  5) Start Hive

  hive

3. Test

Note: change the character set of the metastore database in MySQL to latin1 (alter database hive character set latin1;), otherwise Hive will report errors.
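As a quick smoke test of the Hive/HBase integration, something along these lines can be run (the table name, column family and column mapping below are made up for illustration):

  # create a Hive table backed by a new HBase table
  hive -e "CREATE TABLE hbase_test(key int, value string)
           STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
           WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,cf1:val')
           TBLPROPERTIES ('hbase.table.name' = 'hbase_test');"
  # if 'hbase_test' then shows up under 'list' in the HBase shell, the integration works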


 

 
