Spark 2.1.0 standalone mode: the master fails to start
After running start-master.sh, the log reports the following error:
starting org.apache.spark.deploy.master.Master, logging to /home/hadoop/spark-2.1.0-bin-hadoop2.7/logs/spark-hadoop-org.apache.spark.deploy.master.Master-1-hadoop1.out
[root@hadoop1 sbin]# cat /home/hadoop/spark-2.1.0-bin-hadoop2.7/logs/spark-hadoop-org.apache.spark.deploy.master.Master-1-hadoop1.out
Spark Command: /home/hadoop/hadoop/jdk1.8.0_101/bin/java -cp /home/hadoop/spark-2.1.0-bin-hadoop2.7/conf/:/home/hadoop/spark-2.1.0-bin-hadoop2.7/jars/* -Xmx1g org.apache.spark.deploy.master.Master --host hadoop1 --port 7077 --webui-port 8080
========================================
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
17/03/04 21:09:01 INFO Master: Started daemon with process name: 14373@hadoop1
17/03/04 21:09:01 INFO SignalUtils: Registered signal handler for TERM
17/03/04 21:09:01 INFO SignalUtils: Registered signal handler for HUP
17/03/04 21:09:01 INFO SignalUtils: Registered signal handler for INT
17/03/04 21:09:01 WARN MasterArguments: SPARK_MASTER_IP is deprecated, please use SPARK_MASTER_HOST
17/03/04 21:09:02 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/03/04 21:09:02 INFO SecurityManager: Changing view acls to: root
17/03/04 21:09:02 INFO SecurityManager: Changing modify acls to: root
17/03/04 21:09:02 INFO SecurityManager: Changing view acls groups to:
17/03/04 21:09:02 INFO SecurityManager: Changing modify acls groups to:
17/03/04 21:09:02 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); groups with view permissions: Set(); users with modify permissions: Set(root); groups with modify permissions: Set()
17/03/04 21:09:02 WARN Utils: Service 'sparkMaster' could not bind on port 7077. Attempting port 7078.
17/03/04 21:09:02 WARN Utils: Service 'sparkMaster' could not bind on port 7078. Attempting port 7079.
[... the same warning repeats for ports 7079 through 7091 ...]
17/03/04 21:09:02 WARN Utils: Service 'sparkMaster' could not bind on port 7092. Attempting port 7093.
Exception in thread "main" java.net.BindException: Cannot assign requested address: Service 'sparkMaster' failed after 16 retries (starting from 7077)!
Consider explicitly setting the appropriate port for the service 'sparkMaster' (for example spark.ui.port for SparkUI) to an available port or increasing spark.port.maxRetries.
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:433)
	at sun.nio.ch.Net.bind(Net.java:425)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
	at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:127)
	at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:501)
	at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1218)
	at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:506)
	at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:491)
	at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:965)
	at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:210)
	at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:353)
	at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:408)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:455)
	at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:140)
	at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
	at java.lang.Thread.run(Thread.java:745)
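The key line is the final BindException: the master was asked to bind to the host hadoop1 (see the --host hadoop1 argument in the Spark command above), but that hostname does not resolve to an address owned by any local network interface, so every bind attempt fails until the retry limit (spark.port.maxRetries, 16 by default) is exhausted. A quick diagnostic sketch, assuming a standard Linux environment; the IP address shown is only an example:

# What does hadoop1 resolve to? (getent checks /etc/hosts and DNS)
getent hosts hadoop1

# Which addresses does this machine actually own?
ip addr show

# If hadoop1 resolves to an address not listed above (or does not resolve
# at all), map it to a local address in /etc/hosts, for example:
#   192.168.1.10   hadoop1    <- example IP; substitute your machine's own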
Solution:
Add the following to conf/spark-env.sh (if the file does not exist yet, create it from conf/spark-env.sh.template):
export SPARK_MASTER_HOST=127.0.0.1
export SPARK_LOCAL_IP=127.0.0.1
Run the start script again and the master comes up normally. Note that binding to 127.0.0.1 is only appropriate for the single-machine setup discussed here; on a real cluster, fix the hadoop1 entry in /etc/hosts instead so the hostname resolves to a local, bindable address.
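To verify the fix, restart the master and confirm that the bind now succeeds. A minimal check, assuming the install directory from the log above (exact log wording may vary slightly between Spark versions):

cd /home/hadoop/spark-2.1.0-bin-hadoop2.7
sbin/stop-master.sh     # clean up any half-started instance first
sbin/start-master.sh

# The Master process should appear in jps ...
jps | grep Master

# ... and the log should now report a successful bind, e.g.:
#   INFO Utils: Successfully started service 'sparkMaster' on port 7077.
#   INFO Master: Starting Spark master at spark://127.0.0.1:7077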