Spark: "Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient memory" when submitting a job
When submitting a job to a Spark cluster with spark-submit, the following exception may appear:

Exception 1: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient memory

This warning means the master accepted the application but cannot allocate executors for it, typically because no workers have registered with the master, or because the registered workers do not have enough free memory or cores to satisfy the request.
Inspecting the Spark master log file spark-Spark-org.apache.spark.deploy.master.Master-1-hadoop1.out points to the cause (log excerpt not reproduced here).
The Spark Web UI at this point showed the same problem (screenshot not reproduced here).
Solution:
Edit the Spark configuration file spark-env.sh and change SPARK_LOCAL_IP from localhost to the machine's actual hostname, as sketched below.
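For reference, a minimal sketch of the change in conf/spark-env.sh; the hostname hadoop1 is an assumption inferred from the log file name above, and in practice each node should use its own hostname:

# conf/spark-env.sh
# Bind this node to its real hostname instead of localhost, so the
# master and workers can reach each other over the network.
# "hadoop1" is an assumed hostname; set each node's own name here.
export SPARK_LOCAL_IP=hadoop1

After editing the file on every node, restart the cluster (sbin/stop-all.sh followed by sbin/start-all.sh) so the setting takes effect, then confirm in the master's Web UI that the workers are registered.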
Submit the job (WordCount) to the Spark cluster; the submit script runSpark.sh is as follows:
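The original script body was not preserved, so what follows is only a minimal sketch of what runSpark.sh might contain, assuming a standalone master at spark://hadoop1:7077, an application class named WordCount, and a hypothetical jar and input path:

#!/bin/bash
# runSpark.sh -- submit the WordCount application to the standalone cluster.
# Master URL, jar path, input path, and resource sizes are assumptions.
$SPARK_HOME/bin/spark-submit \
  --master spark://hadoop1:7077 \
  --class WordCount \
  --executor-memory 512m \
  --total-executor-cores 2 \
  /path/to/wordcount.jar \
  hdfs://hadoop1:9000/input/words.txt

Keeping --executor-memory and --total-executor-cores within what the workers report in the Web UI also avoids the "Initial job has not accepted any resources" warning in the case where the cluster is healthy but simply over-asked.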
Execute the script to run the Spark job:
./runSpark.sh