
[SPARK-288] Spark Connector 3.0.1 MongoSpark local class incompatible error

    • Type: Bug
    • Resolution: Done
    • Priority: Major - P3
    • Affects Version/s: 3.0.1
    • Component/s: API
    • Environment:
      Spark cluster with multiple workers running in Docker containers on a Docker host

      Description:
      I am using the Apache Spark Java API and the MongoDB Spark Connector in Play Framework 2.8.7.
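      The connector is pulled in via build.sbt, roughly like this (a minimal sketch; the %% cross-version operator reflects my sbt setup):

      // build.sbt: dependency line for the MongoDB Spark Connector
      libraryDependencies += "org.mongodb.spark" %% "mongo-spark-connector" % "3.0.1"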

      I updated this "mongo-spark-connector" dependency from version 3.0.0 to version 3.0.1. Now, when running a job on the Spark cluster, I get a "local class incompatible" error:

      org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 10.0 failed 4 times, most recent failure: Lost task 0.3 in stage 10.0 (TID 26, 172.18.0.2, executor 1): java.io.InvalidClassException: com.mongodb.spark.MongoSpark$; local class incompatible: stream classdesc serialVersionUID = -148646310337786170, local class serialVersionUID = -3005450305892693805

      This normally happens when a class that implements Serializable doesn't define a serialVersionUID in Java (see the sketch below). But this error doesn't occur with Spark Connector 3.0.0. Any hints are welcome.
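      For illustration, a minimal Scala sketch of how a serialVersionUID is pinned (the class name and field are made up, not the connector's actual code):

      // Hypothetical class: @SerialVersionUID pins the UID, so instances
      // serialized by one build can be deserialized by a recompiled build
      // as long as the class layout stays compatible.
      @SerialVersionUID(1L)
      class TaskPayload(val collection: String) extends java.io.Serializable

      // Without the annotation, the compiler derives the UID from the class
      // signature, so any structural change between builds (for example
      // connector 3.0.0 vs. 3.0.1) produces a different UID and a
      // java.io.InvalidClassException on deserialization.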

            Assignee:
            Unassigned
            Reporter:
            Yingding Wang (wang@pms.ifi.lmu.de)
            Votes:
            0
            Watchers:
            3
