Spark Connector / SPARK-428

Mongo connector 10.2.3 throws error

    • Type: Bug
    • Resolution: Works as Designed
    • Priority: Major - P3
    • Affects Version/s: 10.2.3
    • Component/s: API, build
    • Java Drivers

      1. What would you like to communicate to the user about this feature?
      2. Would you like the user to see examples of the syntax and/or executable code and its output?
      3. Which versions of the driver/connector does this apply to?


      Trying to use the MongoDB Spark connector with the following versions and getting an error; see below.

      Versions:

      • mongo-spark-connector-10.2.3
      • SPARK_VERSION=3.5.0
      • SCALA_VERSION_BASE=2.13
      • SCALA_VERSION=2.13.12

      Jars:

      • mongodb-driver-core-5.1.0.jar
      • bson-5.1.0.jar
      • mongo-spark-connector_2.13-10.2.3.jar
      • mongodb-driver-sync-5.1.0.jar

      Dockerfile: https://github.com/rockthejvm/spark-essentials/blob/master/spark-cluster/docker/base/Dockerfile

      Code:

      from pyspark.sql import SparkSession

      spark: SparkSession = SparkSession.builder.appName("myApp").getOrCreate()
      df = (
          spark.read.format("mongodb")
          .option("database", "v1")
          .option("collection", "contacts")
          .option("spark.mongodb.read.connection.uri", "mongodb+srv://xxx/")
          .load()
      )
      filtered_df = df.filter(df["place_id"] == "id")
      print(filtered_df.collect()[0])  # this fails
      

       

      Error:

       java.lang.NoSuchMethodError: 'org.apache.spark.sql.catalyst.encoders.ExpressionEncoder org.apache.spark.sql.catalyst.encoders.RowEncoder$.apply(org.apache.spark.sql.types.StructType)'
          at com.mongodb.spark.sql.connector.schema.RowToInternalRowFunction.<init>(RowToInternalRowFunction.java:40)
          at com.mongodb.spark.sql.connector.schema.BsonDocumentToRowConverter.<init>(BsonDocumentToRowConverter.java:99)
          at com.mongodb.spark.sql.connector.read.MongoBatch.<init>(MongoBatch.java:47)
          at com.mongodb.spark.sql.connector.read.MongoScan.toBatch(MongoScan.java:61)
       
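      The NoSuchMethodError points to a Spark/connector version mismatch rather than a connector bug, consistent with the "Works as Designed" resolution: `RowEncoder$.apply(StructType)` returned an `ExpressionEncoder` up through Spark 3.4, and that API changed in Spark 3.5.0, so a connector built against an earlier Spark fails at runtime on 3.5. A possible remediation, sketched below as a launch configuration, is to align the connector with the Spark runtime. The 10.3.0 coordinate is an assumption (it appears to be the first connector line targeting Spark 3.5); verify against the connector's published compatibility matrix before pinning it.

      ```shell
      # Sketch, not from the ticket: launch with a connector release built
      # against the Spark runtime in use (10.3.0 here is an assumption).
      spark-submit \
        --packages org.mongodb.spark:mongo-spark-connector_2.13:10.3.0 \
        my_app.py
      ```

      Alternatively, staying on connector 10.2.3 would mean running a Spark version it was built against (3.4.x or earlier) rather than 3.5.0.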

            Assignee: Ross Lawley (ross@mongodb.com)
            Reporter: Brandon Max (brandon@pursuit.us)