Spark Connector / SPARK-303

Support copy existing when using Spark streams

    • Type: New Feature
    • Resolution: Unresolved
    • Priority: Major - P3
    • Affects Version/s: None
    • Component/s: Source, Stream
    • Documentation Changes: Needed

      1. What would you like to communicate to the user about this feature?
      Update the FAQ with instructions on how to perform the copy of existing data. Use the syntax example provided by Ross in the comments.

      3. Which versions of the driver/connector does this apply to?
      10.x


      Build a mechanism that ensures all data already in the existing source (typically MongoDB) has been synced to the sink (typically a data lake), and that the copy is performed in an exactly-once manner.
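      A minimal PySpark sketch of one possible workaround under connector 10.x: perform a one-time batch copy of the existing collection, then start the change stream against the same sink. The connection URI, database/collection names, sink and checkpoint paths, and the connector Maven coordinate are placeholder assumptions; this is not the syntax example from the comments.

{code:python}
from pyspark.sql import SparkSession

# Placeholder connection details and paths; adjust for your deployment.
URI = "mongodb://localhost:27017"
SINK = "/tmp/datalake/events"

spark = (
    SparkSession.builder.appName("copy-existing-then-stream")
    .config("spark.jars.packages",
            "org.mongodb.spark:mongo-spark-connector_2.12:10.2.1")
    .getOrCreate()
)

# Step 1: one-time batch copy of the data that already exists in the source.
existing = (
    spark.read.format("mongodb")
    .option("connection.uri", URI)
    .option("database", "test")
    .option("collection", "events")
    .load()
)
existing.write.mode("append").parquet(SINK)

# Step 2: start the change stream so new writes keep flowing to the sink.
# Publishing only the full document lets the batch schema be reused.
changes = (
    spark.readStream.format("mongodb")
    .option("connection.uri", URI)
    .option("database", "test")
    .option("collection", "events")
    .option("change.stream.publish.full.document.only", "true")
    .schema(existing.schema)
    .load()
)
(
    changes.writeStream.format("parquet")
    .option("checkpointLocation", "/tmp/checkpoints/events")
    .outputMode("append")
    .start(SINK)
    .awaitTermination()
)
{code}

      Note that this sequence leaves a window between the batch copy and the stream start in which writes can be missed, and documents updated during the copy can land in the sink twice; that is precisely the exactly-once gap a built-in copy-existing mechanism would need to close.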

            Assignee:
            Unassigned
            Reporter:
            Ross Lawley (ross@mongodb.com)
            Votes:
            1
            Watchers:
            6
