
Error when starting pyspark

ERROR spark.SparkContext: Error initializing SparkContext.
java.lang.IllegalArgumentException: Required executor memory (1024+384 MB) is above the max threshold (1024 MB) of this cluster! Please check the values of 'yarn.scheduler.maximum-allocation-mb' and/or 'yarn.nodemanager.resource.memory-mb'.
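The arithmetic in the message: Spark requests 1024 MB of executor memory plus a 384 MB off-heap overhead (max(384 MB, ~10% of executor memory)), for 1408 MB total, which is larger than the 1024 MB maximum container size YARN will grant. If you cannot change the cluster configuration, one workaround is to request a smaller executor so that memory plus overhead fits under the threshold; a minimal sketch (the 512m value is illustrative):

pyspark --conf spark.executor.memory=512m

With 512 MB of executor memory the overhead is still the 384 MB floor, so the container request is 896 MB, which fits under the 1024 MB limit.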

 

Otherwise, adjust the following YARN memory settings in yarn-site.xml so the maximum container size can accommodate the requested executor memory plus overhead:

yarn.nodemanager.resource.memory-mb   – total memory each NodeManager can hand out to containers
yarn.scheduler.minimum-allocation-mb  – smallest container the ResourceManager will allocate
yarn.scheduler.maximum-allocation-mb  – largest container the ResourceManager will allocate
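For example, the following yarn-site.xml entries raise the maximum container size well above the 1408 MB request (the 4096/512 values are illustrative; pick sizes that fit your nodes, and restart YARN after changing them):

<property>
  <!-- total memory available to containers on each NodeManager -->
  <name>yarn.nodemanager.resource.memory-mb</name>
  <value>4096</value>
</property>
<property>
  <!-- smallest container YARN will allocate -->
  <name>yarn.scheduler.minimum-allocation-mb</name>
  <value>512</value>
</property>
<property>
  <!-- largest container YARN will allocate; must exceed executor memory + overhead -->
  <name>yarn.scheduler.maximum-allocation-mb</name>
  <value>4096</value>
</property>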
