
Oozie_02 Installation Errors【20161116】

【Cause of the error】Hadoop's core-site.xml was misconfigured: the hostname hadoop01 was used in the proxy-user property names where the username hadoop should appear.

<!-- OOZIE -->
<property>
    <name>hadoop.proxyuser.hadoop01.hosts</name>
    <value>*</value>
</property>
<property>
    <name>hadoop.proxyuser.hadoop01.groups</name>
    <value>*</value>
</property>

【Solution】Use the username hadoop instead. The property name must be hadoop.proxyuser.<user>.hosts / .groups, where <user> is the account the Oozie server runs as (here hadoop), not the hostname.

<!-- OOZIE -->
<property>
    <name>hadoop.proxyuser.hadoop.hosts</name>
    <value>*</value>
</property>
<property>
    <name>hadoop.proxyuser.hadoop.groups</name>
    <value>*</value>
</property>
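
After saving this change to core-site.xml, the NameNode (and the ResourceManager, if it also enforces impersonation) has to reload the proxy-user settings before Oozie can impersonate users. A minimal sketch, assuming the same tarball layout shown in the sessions below, is to refresh the settings in place rather than restart the daemons:

# Reload the hadoop.proxyuser.* settings on the NameNode and ResourceManager
[hadoop@hadoop01 hadoop-2.5.0-cdh5.3.6]$ bin/hdfs dfsadmin -refreshSuperUserGroupsConfiguration
[hadoop@hadoop01 hadoop-2.5.0-cdh5.3.6]$ bin/yarn rmadmin -refreshSuperUserGroupsConfiguration

Restarting HDFS and YARN has the same effect if a refresh is not convenient.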

【How to check the hostname, the current user, and a user's groups】

[hadoop@hadoop01 hadoop-2.5.0-cdh5.3.6]$ hostname
hadoop01
[hadoop@hadoop01 hadoop-2.5.0-cdh5.3.6]$ whoami
hadoop
[hadoop@hadoop01 hadoop-2.5.0-cdh5.3.6]$ groups userName
groups: userName: No such user
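
The last command failed only because the literal placeholder userName was passed to groups; it expects a real account name. A sketch with the actual user from this setup (hadoop, as reported by whoami) would be:

# List the groups the hadoop user belongs to (on a default install usually just "hadoop")
[hadoop@hadoop01 hadoop-2.5.0-cdh5.3.6]$ groups hadoop

Knowing the group is useful if hadoop.proxyuser.hadoop.groups is later tightened from * to the specific group of the users being impersonated.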

【The error message was as follows】:

[hadoop@hadoop01 oozie-4.0.0-cdh5.3.6]$ bin/oozie-setup.sh sharelib create -fs hdfs://hadoop01:8020 -locallib oozie-sharelib-4.0.0-cdh5.3.6-yarn.tar.gz
setting CATALINA_OPTS="$CATALINA_OPTS -Xmx1024m"
log4j:WARN No appenders could be found for logger (org.apache.hadoop.util.Shell).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/cdh-5.3.6/oozie-4.0.0-cdh5.3.6/libtools/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/cdh-5.3.6/oozie-4.0.0-cdh5.3.6/libtools/slf4j-simple-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/cdh-5.3.6/oozie-4.0.0-cdh5.3.6/libext/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
the destination path for sharelib is: /user/hadoop/share/lib/lib_20161115074321

Error: User: hadoop is not allowed to impersonate hadoop

Stack trace for the error was (for debug purposes):
--------------------------------------
org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): User: hadoop is not allowed to impersonate hadoop
at org.apache.hadoop.ipc.Client.call(Client.java:1411)
at org.apache.hadoop.ipc.Client.call(Client.java:1364)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
at com.sun.proxy.$Proxy14.getFileInfo(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:744)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
at com.sun.proxy.$Proxy15.getFileInfo(Unknown Source)
at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1912)
at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1089)
at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1085)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1085)
at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1400)
at org.apache.hadoop.fs.FileUtil.checkDest(FileUtil.java:496)
at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:348)
at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:338)
at org.apache.hadoop.fs.FileSystem.copyFromLocalFile(FileSystem.java:1905)
at org.apache.hadoop.fs.FileSystem.copyFromLocalFile(FileSystem.java:1873)
at org.apache.oozie.tools.OozieSharelibCLI.run(OozieSharelibCLI.java:165)
at org.apache.oozie.tools.OozieSharelibCLI.main(OozieSharelibCLI.java:56)
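
With the corrected proxy-user properties in place and reloaded, the sharelib setup can simply be retried. As a sketch, this reuses the exact command from the failing session above and then lists the destination directory the tool reported (/user/hadoop/share/lib/...):

[hadoop@hadoop01 oozie-4.0.0-cdh5.3.6]$ bin/oozie-setup.sh sharelib create -fs hdfs://hadoop01:8020 -locallib oozie-sharelib-4.0.0-cdh5.3.6-yarn.tar.gz
# Verify the sharelib directory now exists in HDFS (run from the Hadoop install directory)
[hadoop@hadoop01 hadoop-2.5.0-cdh5.3.6]$ bin/hdfs dfs -ls /user/hadoop/share/lib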
