Spark Connector / SPARK-93

MongoSpark ignoring ReadConfigs for Multiple Collections

    • Type: Bug
    • Resolution: Done
    • Priority: Major - P3
    • Fix Version/s: 2.0.0
    • Affects Version/s: 2.0.0-rc1
    • Component/s: Configuration
    • Environment:
      Windows 10, Spark 2.0.0, Scala 2.11.2

    • Description:

      I have a few collections that I need to read in. I've tried following the documentation on how to set ReadConfig, but I'm still coming up blank.

      I've attached a Scala object, where I try connecting to 2 different collections. MongoSpark ignores my read config in the second instance, and continues to read from the first collection.

      I've spent about 4 hours trying to figure this out. I've tried many permutations, which all yield errors.

      The attached files were produced with:

      • Mongo settings NOT specified in spark-defaults.conf
      • No other --conf options passed to my Spark master/driver
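
      For reference, below is a minimal sketch of the pattern I'm attempting (the URI and collection names are placeholders, not the exact contents of the attached file), using the connector's ReadConfig overloads:

      {code:scala}
      import org.apache.spark.sql.SparkSession
      import com.mongodb.spark.MongoSpark
      import com.mongodb.spark.config.ReadConfig

      object MultipleCollectionsSketch {
        def main(args: Array[String]): Unit = {
          val spark = SparkSession.builder()
            .appName("MultipleCollections")
            .config("spark.mongodb.input.uri", "mongodb://localhost/test.firstCollection")
            .getOrCreate()

          // First load: uses the default input URI, i.e. test.firstCollection.
          val firstDf = MongoSpark.load(spark)

          // Second load: override only the collection name via an explicit ReadConfig,
          // falling back to the session defaults for everything else.
          val secondReadConfig = ReadConfig(
            Map("collection" -> "secondCollection"),
            Some(ReadConfig(spark))
          )
          val secondDf = MongoSpark.load(spark, secondReadConfig)

          // Expected: the two reads return different collections.
          // Observed: both return documents from firstCollection.
          println(s"first: ${firstDf.count()}, second: ${secondDf.count()}")

          spark.stop()
        }
      }
      {code}

      The behaviour matches the description above: the second load keeps returning documents from the first collection, even though the ReadConfig names the second one.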

        Attachments:

        1. multiplecollections.log (39 kB)
        2. MultipleCollections.scala (1 kB)

            Assignee:
            Ross Lawley (ross@mongodb.com)
            Reporter:
            Neville Dipale (nevi_me)
            Votes:
            0
            Watchers:
            2
