Compiling Hadoop 2.2.0 on 64-bit CentOS 6.2
I have recently been learning how to set up Hadoop. I downloaded the latest release, Hadoop 2.2.0, straight from the Apache website, but at runtime it printed the warning "libhadoop.so.1.0.0 which might have disabled stack guard". A bit of Googling showed that the libhadoop.so library shipped with Hadoop 2.2.0 is a 32-bit build, while our machine is 64-bit, so Hadoop has to be recompiled on a 64-bit machine.
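If you want to confirm this is really the problem, check the architecture of the native library that ships in the binary tarball with file (the path below assumes the stock hadoop-2.2.0 binary package unpacked in the current directory):
file hadoop-2.2.0/lib/native/libhadoop.so.1.0.0
# the stock download reports "ELF 32-bit LSB shared object"; after the rebuild in this guide it should say "ELF 64-bit"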
1. Install the JDK
Download the 64-bit Linux build of JDK 1.7, jdk-7u15-linux-x64.tar.gz, and unpack it. Download address: http://www.oracle.com/technetwork/cn/java/javase/downloads/jdk7-downloads-1880260-zhs.html
tar -xvf jdk-7u15-linux-x64.tar.gz
mv jdk1.7.0_15/ /opt/jdk1.7    # the tarball unpacks to a directory named jdk1.7.0_15
Then configure the environment variables:
vim /etc/profile
Append at the end:
export JAVA_HOME=/opt/jdk1.7
export PATH=$PATH:$JAVA_HOME/bin
Save and exit, then reload the configuration:
source /etc/profile
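A quick sanity check that the new JDK is the one being picked up:
java -version
# should report a 1.7.0 version coming from /opt/jdk1.7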
2. Install the build dependencies
yum install gcc gcc-c++ make cmake openssl-devel ncurses-devel
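Hadoop's BUILDING.txt also lists zlib headers among the native-build requirements, so if the native compilation later complains about zlib it may be worth installing that package as well (an extra on top of the list above):
yum install zlib-devel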
3. Install protoc 2.5.0
Building Hadoop 2.2.0 requires protoc 2.5.0, so protoc also has to be downloaded. Download address: https://code.google.com/p/protobuf/downloads/list. Google seems to be completely blocked these days (you know why), so I have also uploaded it to Baidu Pan: http://pan.baidu.com/s/1rP1hW. Now install it:
tar -xvf protobuf-2.5.0.tar.gz
cd protobuf-2.5.0
mkdir /opt/protoc
./configure --prefix=/opt/protoc
make && make install
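Since protoc was installed under /opt/protoc, the Hadoop build will not find the protoc binary unless /opt/protoc/bin is on the PATH. A minimal way to handle this, following the same /etc/profile pattern as the JDK setup above:
echo 'export PATH=$PATH:/opt/protoc/bin' >> /etc/profile
source /etc/profile
protoc --version
# expected output: libprotoc 2.5.0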
4. Install Maven
Download Maven 3.1.1 from: http://mirror.bit.edu.cn/apache/maven/maven-3/3.1.1/binaries/apache-maven-3.1.1-bin.tar.gz
tar -xvf apache-maven-3.1.1-bin.tar.gz
mv apache-maven-3.1.1 /opt/maven
Then configure the environment variables again, appending the following at the end of /etc/profile:
export MAVEN_HOME=/opt/maven
export PATH=$PATH:$MAVEN_HOME/bin
Save and exit, then reload the configuration:
source /etc/profile
Verify that the configuration works:
mvn -version
Apache Maven 3.1.1 (0728685237757ffbf44136acec0402957f723d9a; 2013-09-17 23:22:22+0800)
Maven home: /opt/maven
Java version: 1.7.0_60, vendor: Oracle Corporation
Java home: /opt/jdk1.7/jre
Default locale: zh_CN, platform encoding: UTF-8
OS name: "linux", version: "2.6.32-220.el6.x86_64", arch: "amd64", family: "unix"
5. Configure a Maven mirror
Since Maven's default overseas repositories may be unreachable, first configure a domestic mirror for Maven:
vi /opt/maven/conf/settings.xml
Inside <mirrors></mirrors>, add:
<mirror>
  <id>nexus-osc</id>
  <mirrorOf>*</mirrorOf>
  <name>Nexusosc</name>
  <url>http://maven.oschina.net/content/groups/public/</url>
</mirror>
Inside <profiles></profiles>, add:
<profile>
  <id>jdk-1.7</id>
  <activation>
    <jdk>1.7</jdk>
  </activation>
  <repositories>
    <repository>
      <id>nexus</id>
      <name>local private nexus</name>
      <url>http://maven.oschina.net/content/groups/public/</url>
      <releases>
        <enabled>true</enabled>
      </releases>
      <snapshots>
        <enabled>false</enabled>
      </snapshots>
    </repository>
  </repositories>
  <pluginRepositories>
    <pluginRepository>
      <id>nexus</id>
      <name>local private nexus</name>
      <url>http://maven.oschina.net/content/groups/public/</url>
      <releases>
        <enabled>true</enabled>
      </releases>
      <snapshots>
        <enabled>false</enabled>
      </snapshots>
    </pluginRepository>
  </pluginRepositories>
</profile>
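To confirm Maven is actually using the mirror, you can dump the effective settings with the standard maven-help-plugin (an optional quick check):
mvn help:effective-settings
# the <mirrors> section of the output should show the maven.oschina.net URL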
6. Patch the Hadoop source
Download the Hadoop 2.2.0 source package from: http://archive.apache.org/dist/hadoop/core/hadoop-2.2.0/hadoop-2.2.0-src.tar.gz
or from: http://mirror.bit.edu.cn/apache/hadoop/core/hadoop-2.2.0/hadoop-2.2.0-src.tar.gz
Unpack it:
tar -xvf hadoop-2.2.0-src.tar.gz
The code in the current 2.2.0 source tarball has a bug and must be patched before it will compile; otherwise the hadoop-auth module fails with the following error:
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:2.5.1:testCompile (default-testCompile) on project hadoop-auth: Compilation failure: Compilation failure:
[ERROR] /home/chuan/trunk/hadoop-common-project/hadoop-auth/src/test/java/org/apache/hadoop/security/authentication/client/AuthenticatorTestCase.java:[84,13] cannot access org.mortbay.component.AbstractLifeCycle
[ERROR] class file for org.mortbay.component.AbstractLifeCycle not found
Download the patch file from: https://issues.apache.org/jira/browse/HADOOP-10110
If that is not reachable, use: http://pan.baidu.com/s/1kTkfAgZ
Put the downloaded patch file in hadoop-2.2.0-src/hadoop-common-project/hadoop-auth/, then run:
cd hadoop-2.2.0-src/hadoop-common-project/hadoop-auth/
patch < HADOOP-10110.patch
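For reference, my understanding of the HADOOP-10110 fix is that it simply adds a jetty-util test-scoped dependency to hadoop-auth's pom.xml; if that is right, a quick way to confirm the patch landed is:
grep -A 2 jetty-util pom.xml
# should show an org.mortbay.jetty jetty-util dependency with <scope>test</scope>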
7. Build Hadoop
Change into the root directory of the Hadoop source tree and start the build:
cd hadoop-2.2.0-src
mvn package -Pdist,native -DskipTests -Dtar
Then comes the long wait……
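If the build fails partway through (a dependency download timing out is the usual culprit), you do not have to start again from scratch: Maven prints a resume hint on failure, and its -rf / --resume-from option restarts the reactor from a given module. A hypothetical example, assuming the hadoop-hdfs module was the one that failed:
mvn package -Pdist,native -DskipTests -Dtar -rf org.apache.hadoop:hadoop-hdfs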
If you see output like the following, the build succeeded:
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................ SUCCESS [3.709s]
[INFO] Apache Hadoop Project POM ......................... SUCCESS [2.229s]
[INFO] Apache Hadoop Annotations ......................... SUCCESS [5.270s]
[INFO] Apache Hadoop Assemblies .......................... SUCCESS [0.388s]
[INFO] Apache Hadoop Project Dist POM .................... SUCCESS [3.485s]
[INFO] Apache Hadoop Maven Plugins ....................... SUCCESS [8.655s]
[INFO] Apache Hadoop Auth ................................ SUCCESS [7.782s]
[INFO] Apache Hadoop Auth Examples ....................... SUCCESS [5.731s]
[INFO] Apache Hadoop Common .............................. SUCCESS [1:52.476s]
[INFO] Apache Hadoop NFS ................................. SUCCESS [9.935s]
[INFO] Apache Hadoop Common Project ...................... SUCCESS [0.110s]
[INFO] Apache Hadoop HDFS ................................ SUCCESS [1:58.347s]
[INFO] Apache Hadoop HttpFS .............................. SUCCESS [26.915s]
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS [17.002s]
[INFO] Apache Hadoop HDFS-NFS ............................ SUCCESS [5.292s]
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [0.073s]
[INFO] hadoop-yarn ....................................... SUCCESS [0.335s]
[INFO] hadoop-yarn-api ................................... SUCCESS [54.478s]
[INFO] hadoop-yarn-common ................................ SUCCESS [39.215s]
[INFO] hadoop-yarn-server ................................ SUCCESS [0.241s]
[INFO] hadoop-yarn-server-common ......................... SUCCESS [15.601s]
[INFO] hadoop-yarn-server-nodemanager .................... SUCCESS [21.566s]
[INFO] hadoop-yarn-server-web-proxy ...................... SUCCESS [4.754s]
[INFO] hadoop-yarn-server-resourcemanager ................ SUCCESS [20.625s]
[INFO] hadoop-yarn-server-tests .......................... SUCCESS [0.755s]
[INFO] hadoop-yarn-client ................................ SUCCESS [6.748s]
[INFO] hadoop-yarn-applications .......................... SUCCESS [0.155s]
[INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS [4.661s]
[INFO] hadoop-mapreduce-client ........................... SUCCESS [0.160s]
[INFO] hadoop-mapreduce-client-core ...................... SUCCESS [36.090s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SUCCESS [2.753s]
[INFO] hadoop-yarn-site .................................. SUCCESS [0.151s]
[INFO] hadoop-yarn-project ............................... SUCCESS [4.771s]
[INFO] hadoop-mapreduce-client-common .................... SUCCESS [24.870s]
[INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS [3.812s]
[INFO] hadoop-mapreduce-client-app ....................... SUCCESS [15.759s]
[INFO] hadoop-mapreduce-client-hs ........................ SUCCESS [6.831s]
[INFO] hadoop-mapreduce-client-jobclient ................. SUCCESS [8.126s]
[INFO] hadoop-mapreduce-client-hs-plugins ................ SUCCESS [2.320s]
[INFO] Apache Hadoop MapReduce Examples .................. SUCCESS [9.596s]
[INFO] hadoop-mapreduce .................................. SUCCESS [3.905s]
[INFO] Apache Hadoop MapReduce Streaming ................. SUCCESS [7.118s]
[INFO] Apache Hadoop Distributed Copy .................... SUCCESS [11.651s]
[INFO] Apache Hadoop Archives ............................ SUCCESS [2.671s]
[INFO] Apache Hadoop Rumen ............................... SUCCESS [10.038s]
[INFO] Apache Hadoop Gridmix ............................. SUCCESS [6.062s]
[INFO] Apache Hadoop Data Join ........................... SUCCESS [4.104s]
[INFO] Apache Hadoop Extras .............................. SUCCESS [4.210s]
[INFO] Apache Hadoop Pipes ............................... SUCCESS [9.419s]
[INFO] Apache Hadoop Tools Dist .......................... SUCCESS [2.306s]
[INFO] Apache Hadoop Tools ............................... SUCCESS [0.037s]
[INFO] Apache Hadoop Distribution ........................ SUCCESS [21.579s]
[INFO] Apache Hadoop Client .............................. SUCCESS [7.299s]
[INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS [7.347s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 11:53.144s
[INFO] Finished at: Tue Jun 17 16:58:32 CST 2014
[INFO] Final Memory: 70M/239M
[INFO] ------------------------------------------------------------------------
The built Hadoop distribution ends up under hadoop-2.2.0-src/hadoop-dist/target/. With that, the 64-bit compilation of Hadoop 2.2.0 is complete.
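As a final check, and to deploy the result into an existing installation unpacked from the 32-bit binary tarball, something like the following should do (the paths follow the usual hadoop-dist layout; $HADOOP_HOME stands for wherever your existing installation lives):
cd hadoop-2.2.0-src/hadoop-dist/target/hadoop-2.2.0
file lib/native/libhadoop.so.1.0.0
# should now report "ELF 64-bit LSB shared object, x86-64"
# optionally overwrite the 32-bit native libraries in an existing install:
# cp -r lib/native/* $HADOOP_HOME/lib/native/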