spark.yarn.executor.memoryOverhead
has now been deprecated:
WARN spark.SparkConf: The configuration key 'spark.yarn.executor.memoryOverhead' has been deprecated as of Spark 2.3 and may be removed in the future. Please use the new key 'spark.executor.memoryOverhead' instead.
You can programmatically set spark.executor.memoryOverhead
by passing it as a config:
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .master('yarn')
    .appName('StackOverflow')
    .config('spark.driver.memory', '35g')
    .config('spark.executor.cores', 5)
    .config('spark.executor.memory', '35g')
    .config('spark.dynamicAllocation.enabled', True)
    .config('spark.dynamicAllocation.maxExecutors', 25)
    # use the new key instead of the deprecated spark.yarn.executor.memoryOverhead
    .config('spark.executor.memoryOverhead', '4096')
    .getOrCreate()
)
sc = spark.sparkContext
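As a quick sanity check (a minimal sketch, assuming the session built above), you can read the value back to confirm the setting was picked up. Note that it only takes effect if no SparkSession was already running when the builder was called:

# Read the overhead setting back from the live session's configuration
print(spark.conf.get('spark.executor.memoryOverhead'))    # expected: '4096'
print(sc.getConf().get('spark.executor.memoryOverhead'))  # same value via the SparkContext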