Hadoop + Hive + Spark Setup (Part 2)
Upload the Hive package to any node in the cluster.
1. Installing Hive
Extract the Hive package into the /usr/local/hadoop/ directory.
Rename the extracted folder to hive.
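A minimal sketch of the two steps above, assuming the package is apache-hive-2.3.9-bin.tar.gz (substitute your actual version):
tar -zxvf apache-hive-2.3.9-bin.tar.gz -C /usr/local/hadoop/
mv /usr/local/hadoop/apache-hive-2.3.9-bin /usr/local/hadoop/hive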
Add the following environment variables to /etc/profile:
export HIVE_HOME=/usr/local/hadoop/hive
export PATH=$HIVE_HOME/bin:$PATH
Run source /etc/profile to make the variables take effect.
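An optional check that the variables took effect:
echo $HIVE_HOME
which hive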
If you use MySQL as the metastore database, MySQL must be installed first.
In MySQL, create a hive user and a metastore database, then exit MySQL.
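A minimal sketch of that setup inside the MySQL shell, assuming a user named hive with password hive (pick your own credentials):
CREATE DATABASE hive;
CREATE USER 'hive'@'%' IDENTIFIED BY 'hive';
GRANT ALL PRIVILEGES ON hive.* TO 'hive'@'%';
FLUSH PRIVILEGES;
EXIT;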
Copy mysql-connector-java.jar into the lib/ directory under the Hive home.
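For example, assuming the jar sits in the current directory:
cp mysql-connector-java.jar /usr/local/hadoop/hive/lib/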
2. Editing the configuration files
Copy (or rename) conf/hive-default.xml.template to conf/hive-site.xml in the Hive directory.
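For example:
cd /usr/local/hadoop/hive
cp conf/hive-default.xml.template conf/hive-site.xml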
Edit hive-site.xml in the conf directory (the settings below are for the default Derby database):
<property>
  <name>hive.exec.local.scratchdir</name>
  <value>/home/hive/${system:user.name}</value>
  <description>Local scratch space for Hive jobs</description>
</property>
<property>
  <name>hive.downloaded.resources.dir</name>
  <value>/home/hive/${hive.session.id}_resources</value>
  <description>Temporary local directory for added resources in the remote file system.</description>
</property>
<property>
  <name>hive.querylog.location</name>
  <value>/home/hive/${system:user.name}</value>
  <description>Location of Hive run time structured log file</description>
</property>
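These values point at local directories, so make sure the base directory exists and is writable by the user running Hive (path taken from the values above; Hive expands ${system:user.name} at runtime):
mkdir -p /home/hive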
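If you chose MySQL instead of Derby, hive-site.xml also needs the JDBC connection settings. A sketch assuming MySQL runs on localhost with the hive user and database created earlier (Connector/J 8.x uses com.mysql.cj.jdbc.Driver instead):
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://localhost:3306/hive?createDatabaseIfNotExist=true</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>hive</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>hive</value>
</property>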
3. Running Hive
Initialize the metastore schema with schematool:
For the default Derby database: schematool -initSchema -dbType derby
For a MySQL database: schematool -initSchema -dbType mysql
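Optionally verify the result; schematool's -info flag prints the schema version details:
schematool -info -dbType mysql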
Start Hive by running the hive command.
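A quick smoke test inside the Hive CLI (the table name is just an example):
hive> show databases;
hive> create table test(id int);
hive> show tables;
hive> drop table test;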
Hive is now installed.