Spark Connector / SPARK-280

Enhance save(RDD) to avoid duplicate key exception

    • Type: New Feature
    • Resolution: Fixed
    • Priority: Major - P3
    • 2.4.3, 2.3.5, 2.2.9, 2.1.8, 3.0.1
    • Affects Version/s: None
    • Component/s: None
    • Labels: None

A customer is experiencing a duplicate key exception when executing MongoSpark.save(RDD, writeConfig). There is a workaround that involves manually executing the write operations (see the sketch below). It may be possible to alter the function to check for an _id and honor the replaceDocument flag, similar to save(dataset).
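
For context, a minimal sketch of the manual workaround, assuming an RDD[org.bson.Document] whose documents carry an _id and a MongoDB Java driver recent enough to provide ReplaceOptions (3.7+): rather than MongoSpark.save, which issues plain inserts, replace each document by _id with upsert(true), so existing documents are overwritten instead of raising E11000 duplicate key errors. The helper name upsertRdd and the batch size of 512 are illustrative choices, not connector API.

{code:scala}
import scala.collection.JavaConverters._

import com.mongodb.client.MongoCollection
import com.mongodb.client.model.{ReplaceOneModel, ReplaceOptions}
import com.mongodb.spark.MongoConnector
import com.mongodb.spark.config.WriteConfig
import org.apache.spark.rdd.RDD
import org.bson.Document

// Illustrative workaround (hypothetical helper, not connector API):
// replace-with-upsert keyed on _id, so re-saving existing documents
// overwrites them instead of raising E11000 duplicate key errors.
def upsertRdd(rdd: RDD[Document], writeConfig: WriteConfig): Unit = {
  val connector = MongoConnector(writeConfig.asOptions)
  rdd.foreachPartition { iter =>
    if (iter.nonEmpty) {
      connector.withCollectionDo(writeConfig, { collection: MongoCollection[Document] =>
        // Batch size of 512 is an illustrative choice, not a connector default.
        iter.grouped(512).foreach { batch =>
          val requests = batch.map { doc =>
            new ReplaceOneModel[Document](
              new Document("_id", doc.get("_id")), // match on the document's _id
              doc,
              new ReplaceOptions().upsert(true)    // insert when absent, replace when present
            )
          }
          collection.bulkWrite(requests.toList.asJava)
        }
      })
    }
  }
}
{code}

The proposed enhancement would fold equivalent behaviour into save(RDD) itself: when a document contains an _id, issue a replace-with-upsert (honoring replaceDocument) rather than a plain insert.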

            Assignee: Ross Lawley (ross@mongodb.com)
            Reporter: Steffan Mejia (steffan.mejia@mongodb.com)
            Votes: 0
            Watchers: 4
