
Compiling hadoop-2.2.0

The native hadoop library bundled with the official Hadoop download is a 32-bit build, so installing it on a 64-bit machine fails with errors like the following:

2014-05-30 19:47:49,703 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: registered UNIX signal handlers for [TERM, HUP, INT]
2014-05-30 19:47:49,887 WARN org.apache.hadoop.hdfs.server.common.Util: Path /home/data2/dfs-data should be specified as a URI in configuration files. Please update hdfs configuration.
2014-05-30 19:47:49,888 WARN org.apache.hadoop.hdfs.server.common.Util: Path /home/data3/dfs-data should be specified as a URI in configuration files. Please update hdfs configuration.
2014-05-30 19:47:50,144 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2014-05-30 19:47:50,412 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: loaded properties from hadoop-metrics2.properties
2014-05-30 19:47:50,482 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot period at 10 second(s).
2014-05-30 19:47:50,482 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: DataNode metrics system started
2014-05-30 19:47:50,485 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: File descriptor passing is disabled because libhadoop cannot be loaded.
2014-05-30 19:47:50,486 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Configured hostname is nobida122
2014-05-30 19:47:50,512 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened streaming server at /0.0.0.0:50010
2014-05-30 19:47:50,516 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Balancing bandwith is 1048576 bytes/s
2014-05-30 19:47:50,517 FATAL org.apache.hadoop.hdfs.server.datanode.DataNode: Exception in secureMain
java.lang.RuntimeException: Although a UNIX domain socket path is configured as /home/data3/dn_socket, we cannot start a localDataXceiverServer because libhadoop cannot be loaded.
at org.apache.hadoop.hdfs.server.datanode.DataNode.getDomainPeerServer(DataNode.java:520)
at org.apache.hadoop.hdfs.server.datanode.DataNode.initDataXceiver(DataNode.java:492)
at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:663)
at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:259)
at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1727)
at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1642)
at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1665)
at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:1837)
at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1858)
2014-05-30 19:47:50,520 INFO org.apache.hadoop.util.ExitUtil: Exiting with status 1
2014-05-30 19:47:50,523 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at nobida122/10.60.1.122

Hadoop therefore has to be recompiled (see http://hadoop.apache.org/docs/r2.3.0/hadoop-project-dist/hadoop-common/NativeLibraries.html).
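Before rebuilding, you can confirm that the bundled library really is 32-bit. A quick check with file, assuming HADOOP_HOME points at the unpacked official tarball:

# Inspect the ELF class of the bundled native library
file $HADOOP_HOME/lib/native/libhadoop.so.1.0.0
# The stock download reports something like:
#   ELF 32-bit LSB shared object, Intel 80386, ...
# After a successful 64-bit rebuild it should report:
#   ELF 64-bit LSB shared object, x86-64, ...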

Compiling hadoop-2.2.0 (on CentOS 6.4):

1. First install the build dependencies:

(1) JDK, plus Apache Maven (the build in step 2 is driven by mvn)

(2) yum install gcc gcc-c++ autoconf automake libtool cmake zlib-devel pkgconfig (note these are the CentOS package names; the Debian equivalents g++, zlib1g-dev and pkg-config will not resolve under yum)

(3) protobuf-2.5 (https://code.google.com/p/protobuf/downloads/list): download and unpack it, then run, in order:

./configure --prefix=/usr/local
make
make install
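After installing, a quick sanity check is worthwhile, because Hadoop 2.2.0 insists on protoc 2.5.0 and aborts the build on a version mismatch. This assumes /usr/local/bin is on PATH:

protoc --version          # expect: libprotoc 2.5.0
# If protoc cannot find libprotobuf.so, register /usr/local/lib with the dynamic loader:
sudo sh -c 'echo /usr/local/lib > /etc/ld.so.conf.d/protobuf.conf'
sudo ldconfig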

2. Then build with the command mvn package -Pdist,native -DskipTests -Dtar.
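For reference, a typical invocation from the unpacked source root; the enlarged MAVEN_OPTS is a common precaution for this build on JDK 6/7, not a documented requirement:

cd hadoop-2.2.0-src
# Give Maven extra heap and permgen; the defaults are often too small for the full build
export MAVEN_OPTS="-Xmx1024m -XX:MaxPermSize=512m"
# -Pdist,native builds the distribution plus the native libraries, -DskipTests skips
# test execution (test sources are still compiled), -Dtar also produces a .tar.gz
mvn package -Pdist,native -DskipTests -Dtar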

3. The build may fail with the following error:

[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 4:34.649s
[INFO] Finished at: Tue Jun 03 10:18:25 CST 2014
[INFO] Final Memory: 39M/584M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:2.5.1:testCompile (default-testCompile) on project hadoop-auth: Compilation failure: Compilation failure:
[ERROR] /opt/linshuai/tardir/hadoop-2.2.0-src/hadoop-common-project/hadoop-auth/src/test/java/org/apache/hadoop/security/authentication/client/AuthenticatorTestCase.java:[88,11] error: cannot access AbstractLifeCycle
[ERROR] class file for org.mortbay.component.AbstractLifeCycle not found
[ERROR] /tardir/hadoop-2.2.0-src/hadoop-common-project/hadoop-auth/src/test/java/org/apache/hadoop/security/authentication/client/AuthenticatorTestCase.java:[96,29] error: cannot access LifeCycle
[ERROR] class file for org.mortbay.component.LifeCycle not found
[ERROR] /tardir/hadoop-2.2.0-src/hadoop-common-project/hadoop-auth/src/test/java/org/apache/hadoop/security/authentication/client/AuthenticatorTestCase.java:[98,10] error: cannot find symbol
[ERROR] symbol: method start()
[ERROR] location: variable server of type Server
[ERROR] /tardir/hadoop-2.2.0-src/hadoop-common-project/hadoop-auth/src/test/java/org/apache/hadoop/security/authentication/client/AuthenticatorTestCase.java:[104,12] error: cannot find symbol
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn <goals> -rf :hadoop-auth

The fix comes from https://issues.apache.org/jira/browse/HADOOP-10110: add a test-scoped jetty-util dependency to hadoop-auth's pom.xml:

Index: hadoop-common-project/hadoop-auth/pom.xml
===================================================================
--- hadoop-common-project/hadoop-auth/pom.xml	(revision 1543124)
+++ hadoop-common-project/hadoop-auth/pom.xml	(working copy)
@@ -54,6 +54,11 @@
     </dependency>
     <dependency>
       <groupId>org.mortbay.jetty</groupId>
+      <artifactId>jetty-util</artifactId>
+      <scope>test</scope>
+    </dependency>
+    <dependency>
+      <groupId>org.mortbay.jetty</groupId>
       <artifactId>jetty</artifactId>
       <scope>test</scope>
     </dependency>
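One way to apply the diff above, assuming it has been saved as HADOOP-10110.patch in the source root (editing hadoop-common-project/hadoop-auth/pom.xml by hand works just as well):

cd hadoop-2.2.0-src
# Paths in the diff are relative to the source root, so strip no prefix (-p0)
patch -p0 < HADOOP-10110.patch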

Then resume the build with mvn package -Pdist,native -DskipTests -Dtar -rf :hadoop-auth. Keep the native profile here: hadoop-common, which produces libhadoop, builds after hadoop-auth in the reactor, so dropping it would skip the native libraries.

The -rf flag resumes the reactor from the failed module (hadoop-auth), rebuilding it and every module after it while skipping the modules that already built successfully.

4. The freshly built native libraries end up under hadoop-dist/target/hadoop-2.2.0/lib/native; copy them over the lib/native directory of your Hadoop installation.
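A minimal sketch of that final step, assuming HADOOP_HOME points at the installation to fix:

# Back up the bundled 32-bit libraries, then overwrite them with the new build
cp -r $HADOOP_HOME/lib/native $HADOOP_HOME/lib/native.32bit.bak
cp -r hadoop-dist/target/hadoop-2.2.0/lib/native/* $HADOOP_HOME/lib/native/
# Verify the result; the NativeCodeLoader warning should be gone on the next start
file $HADOOP_HOME/lib/native/libhadoop.so.1.0.0    # expect: ELF 64-bit ... x86-64 ...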