Ignoring non-spark config property: hive.exec.dynamic.partition.mode

How to run a Spark-shell with hive.exec.dynamic.partition.mode=nonstrict?

I tried (as suggested here):

  export SPARK_MAJOR_VERSION=2; spark-shell --conf "hive.exec.dynamic.partition.mode=nonstrict" --properties-file /opt/_myPath_/sparkShell.conf

but it only produces the warning "Ignoring non-spark config property: hive.exec.dynamic.partition.mode=nonstrict".


PS: I am using Spark version 2.2.0.2.6.4.0-91 and Scala version 2.11.8.

NOTE

The need arises after an error on df.write.mode("overwrite").insertInto("db.partitionedTable"):

  org.apache.spark.SparkException: Dynamic partition strict mode requires at least one static partition column. To turn this off set hive.exec.dynamic.partition.mode=nonstrict
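For context, a minimal reproduction looks roughly like this (a sketch run inside spark-shell with Hive support; table and column names are illustrative, only db.partitionedTable is from my actual job):

  // Illustrative partitioned Hive table; the real schema differs.
  spark.sql(
    "CREATE TABLE IF NOT EXISTS db.partitionedTable (id INT, name STRING) " +
    "PARTITIONED BY (dt STRING)")

  // The partition column must come last in the DataFrame.
  val df = Seq((1, "a", "2019-10-30"), (2, "b", "2019-10-31"))
    .toDF("id", "name", "dt")

  // In strict mode this throws: every partition value here is dynamic.
  df.write.mode("overwrite").insertInto("db.partitionedTable")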

Nazarene answered 30/10, 2019 at 21:15

You can try using the spark.hadoop.* prefix, as suggested in the Custom Spark Configuration section of the docs for version 2.3. It might work in 2.2 as well, if that was just a documentation bug :)

spark-shell \
  --conf "spark.hadoop.hive.exec.dynamic.partition=true" \
  --conf "spark.hadoop.hive.exec.dynamic.partition.mode=nonstrict" \
  ...
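Once the shell is up, you can sanity-check that the property actually landed; Spark strips the spark.hadoop. prefix and passes the remainder to the Hadoop configuration (a sketch; the expected output assumes the flags above took effect):

  // Inside the spark-shell launched with the flags above.
  spark.sparkContext.hadoopConfiguration.get("hive.exec.dynamic.partition.mode")
  // expected: String = nonstrict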
Dysphasia answered 30/10, 2019 at 23:12

I had the same problem, and the only workaround I found was to set the config directly on the session before writing, like this:

spark.conf.set("hive.exec.dynamic.partition.mode", "nonstrict")
df.write.mode("overwrite").insertInto("db.partitionedTable")
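For a self-contained sketch of the same workaround (assuming a Hive-enabled session; db.partitionedTable is from the question, db.sourceTable is a hypothetical source):

  import org.apache.spark.sql.SparkSession

  val spark = SparkSession.builder()
    .appName("dynamic-partition-insert")
    .enableHiveSupport()        // needed for insertInto on a Hive table
    .getOrCreate()

  // Runtime SQL configs; they apply to this session only.
  spark.conf.set("hive.exec.dynamic.partition", "true")
  spark.conf.set("hive.exec.dynamic.partition.mode", "nonstrict")

  val df = spark.table("db.sourceTable")  // hypothetical source table
  df.write.mode("overwrite").insertInto("db.partitionedTable")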
Berneicebernelle answered 30/10, 2019 at 22:07