Spark Connector / SPARK-260

Spark Connector using RC version of MongoDB Driver

    • Type: Bug
    • Resolution: Fixed
    • Priority: Major - P3
    • Fix Version/s: 2.4.2, 2.3.4, 2.2.8, 2.1.7
    • Affects Version/s: 2.4.1
    • Component/s: API
    • Labels: None

      So far I've been using the connector and it seemed to work just fine, but as soon as I updated everything it broke due to a version mismatch between mongo-scala-driver and mongo-spark-connector.

      val mongodb        = "org.mongodb.scala"    %% "mongo-scala-driver"    % "2.7.0"
      val sparkConnector = "org.mongodb.spark"    %% "mongo-spark-connector" % "2.4.1"

      With the dependencies shown above, the following errors appear in the log.

      [error] /Users/testing/.coursier/cache/v1/https/repo1.maven.org/maven2/org/mongodb/mongo-java-driver/3.11.0-rc0/mongo-java-driver-3.11.0-rc0.jar:org/bson/util/ClassMap.class
      [error] /Users/testing/.coursier/cache/v1/https/repo1.maven.org/maven2/org/mongodb/bson/3.11.0/bson-3.11.0.jar:org/bson/util/ClassMap.class

      Why is the connector pulling in the rc0 build instead of the already released version? How can I avoid this in sbt and force the released version?
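
      One way to work around the transitive RC dependency until a fixed connector release is available is to pin the driver in the build. This is a sketch, not the official fix: the `dependencyOverrides` setting below is standard sbt, but the exact versions are taken from the report above and assumed to be compatible.

      ```scala
      // build.sbt — minimal sketch, assuming sbt 1.x
      libraryDependencies ++= Seq(
        "org.mongodb.scala" %% "mongo-scala-driver"    % "2.7.0",
        "org.mongodb.spark" %% "mongo-spark-connector" % "2.4.1"
      )

      // Force the released Java driver even though mongo-spark-connector 2.4.1
      // transitively depends on mongo-java-driver 3.11.0-rc0.
      dependencyOverrides += "org.mongodb" % "mongo-java-driver" % "3.11.0"
      ```

      Alternatively, the RC artifact can be excluded on the connector dependency itself with `exclude("org.mongodb", "mongo-java-driver")` and the released driver added explicitly; `dependencyOverrides` is usually the lighter-touch option because it leaves the dependency graph intact and only replaces the resolved version.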

            Assignee:
            ross@mongodb.com Ross Lawley
            Reporter:
            edgarherrero@protonmail.com Eddy H
            Votes:
            0 Vote for this issue
            Watchers:
            2 Start watching this issue
