Spark Connector / SPARK-130

Spark connector may need update/replace/delete functions when saving data

    • Type: Improvement
    • Resolution: Works as Designed
    • Priority: Major - P3
    • Affects Version/s: 2.2.0
    • Component/s: API

      In our project, we need to update data stored in MongoDB. At first I used the mongo-hadoop-core jar, but I ran into a "directory item limit" exception and would have had to raise that limit for the whole cluster. I then switched to the mongo-spark connector. Everything went well until I needed to update existing data: the connector does not offer an update function for RDD saves, so I implemented one myself inside the save method, following the DataFrame update behaviour. I think the RDD API should provide insert/update/replace/delete functions matching MongoDB's insert/update/replace/delete operations, for anyone who needs them; a sketch of one possible workaround is shown below.
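      Since the RDD save path only inserts, one workaround (until or unless the API grows such methods) is to reach the underlying Java driver through the connector's MongoConnector.withCollectionDo. The following is a minimal sketch, not connector API: it assumes mongo-spark-connector 2.2.x with its bundled Java driver 3.x, documents keyed by _id, and a hypothetical helper name upsertRDD.

    import com.mongodb.client.MongoCollection
    import com.mongodb.client.model.{Filters, UpdateOptions}
    import com.mongodb.spark.MongoConnector
    import com.mongodb.spark.config.WriteConfig
    import org.apache.spark.rdd.RDD
    import org.bson.Document

    // Hypothetical helper (the name is ours): replace-or-insert each document by _id.
    def upsertRDD(rdd: RDD[Document], writeConfig: WriteConfig): Unit = {
      rdd.foreachPartition { partition: Iterator[Document] =>
        if (partition.nonEmpty) {
          // MongoConnector caches clients per executor JVM, so this is cheap per partition.
          val connector = MongoConnector(writeConfig.asOptions)
          connector.withCollectionDo(writeConfig, { collection: MongoCollection[Document] =>
            partition.foreach { doc =>
              // replaceOne with upsert(true) mirrors MongoDB's replace semantics:
              // overwrite the matching document, or insert if none exists.
              collection.replaceOne(
                Filters.eq("_id", doc.get("_id")),
                doc,
                new UpdateOptions().upsert(true))
            }
          })
        }
      }
    }

      Writing per partition keeps one collection handle per task rather than per record; for large partitions, batching the documents into a collection.bulkWrite call would cut round trips further. On the DataFrame/Dataset path, MongoSpark.save already upserts rows that carry an _id field, which is presumably the "dataframe update function" referred to above and the behaviour behind the "Works as Designed" resolution.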

            Assignee: Ross Lawley (ross@mongodb.com)
            Reporter: Davy Song (hevensun)
            Votes: 0
            Watchers: 5
