// Coalesce the data down to a single file before writing.
data.coalesce(1).write
The write fails with the following error:

Number of dynamic partitions created is 2100, which is more than 1000.
To solve this try to set hive.exec.max.dynamic.partitions to at least 2100.;
The configuration below must be set before the Spark application starts:

spark.hadoop.hive.exec.max.dynamic.partitions

Setting it with a SQL SET statement against a running Spark SQL server will not work; the value has to be in place when the server starts. See https://github.com/apache/spark/pull/18769
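As a sketch, the setting can be passed at session construction time. The app name and the limit of 5000 below are placeholder values chosen for illustration; pick a limit at least as large as the number of partitions your job creates.

```scala
import org.apache.spark.sql.SparkSession

// Sketch: set the Hive limit before the session exists, not via SQL SET.
// "spark.hadoop." prefixes a Hive/Hadoop property so Spark forwards it.
val spark = SparkSession.builder()
  .appName("dynamic-partitions-example") // placeholder name
  .config("spark.hadoop.hive.exec.max.dynamic.partitions", "5000")
  .enableHiveSupport()
  .getOrCreate()
```

The same value can be supplied on the command line, e.g. spark-submit --conf spark.hadoop.hive.exec.max.dynamic.partitions=5000, which likewise takes effect before the application (or server) starts.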