Customer is experiencing a duplicate key exception when attempting to execute MongoSpark.save(RDD, writeConfig); a workaround exists that involves executing the write operations manually (see the sketch below). It may be possible to alter the function to check each document for an _id and honor the replaceDocument flag, as save(dataset) already does.
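A minimal sketch of that manual workaround, assuming an RDD[Document], the Spark Connector 2.x Scala API, and Java driver 3.7+ (for ReplaceOptions). The helper name saveWithReplace and the batch size of 512 are hypothetical, not part of the connector. Documents carrying an _id are written with an upserted replaceOne, mirroring the replaceDocument=true behaviour of save(dataset); documents without an _id are inserted as before:

{code:scala}
import com.mongodb.client.MongoCollection
import com.mongodb.client.model.{Filters, InsertOneModel, ReplaceOneModel, ReplaceOptions, WriteModel}
import com.mongodb.spark.MongoConnector
import com.mongodb.spark.config.WriteConfig
import org.apache.spark.rdd.RDD
import org.bson.Document

import scala.collection.JavaConverters._

// Hypothetical helper: upserts documents that already have an _id instead of
// blindly inserting them, which is what triggers the duplicate key exception.
def saveWithReplace(rdd: RDD[Document], writeConfig: WriteConfig): Unit = {
  val batchSize = 512 // assumed batch size; tune as needed
  val connector = MongoConnector(writeConfig.asOptions)
  rdd.foreachPartition { partition =>
    if (partition.nonEmpty) {
      connector.withCollectionDo(writeConfig, { collection: MongoCollection[Document] =>
        partition.grouped(batchSize).foreach { batch =>
          val requests: Seq[WriteModel[Document]] = batch.map { doc =>
            Option(doc.get("_id")) match {
              // _id present: replace (upsert) the existing document, mirroring
              // the replaceDocument behaviour of save(dataset).
              case Some(id) =>
                new ReplaceOneModel[Document](Filters.eq("_id", id), doc,
                  new ReplaceOptions().upsert(true))
              // No _id: a plain insert cannot collide.
              case None => new InsertOneModel[Document](doc)
            }
          }
          collection.bulkWrite(requests.asJava)
        }
      })
    }
  }
}
{code}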
- Duplicates: SPARK-279 "Duplicate key exception when using Spark Connector save with RDD" (Closed)