Spark Connector / SPARK-189

Spark MongoDb connector unable to get schema right

    • Type: Task
    • Resolution: Works as Designed
    • Priority: Major - P3
    • Affects Version/s: None
    • Component/s: Schema

      Hi,

      I'm moving data from one collection to another collection in a different cluster using Spark. The data's schema is not consistent (the collection contains several slightly different document shapes). When I try to read the data with Spark, the sampling does not pick up all of the schema variants and the read fails with the error below.

       

      com.mongodb.spark.exceptions.MongoTypeConversionException: Cannot cast ARRAY into a NullType (value: BsonArray{values=[
      { "type" : "GUEST_FEE", "appliesPer" : "GUEST_PER_NIGHT", "description" : null, "minAmount" : 33, "maxAmount" : 33 }
      ]})
      

       

      Is it a known issue or any solution for this?

       

      Thanks,

      Srini
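
      A possible workaround (not an official fix, since the issue was resolved "Works as Designed"): either raise the connector's sample size so schema inference sees the rarer document shapes, or, more reliably, supply an explicit schema so a field that is absent or null in the sampled documents is never inferred as NullType. The sketch below is a minimal Scala example; the URIs, the "fees" field name, and the column types are assumptions for illustration and should be adapted to the real documents. The option names follow the 2.x connector configuration (spark.mongodb.input.uri, spark.mongodb.input.sampleSize).

      import org.apache.spark.sql.SparkSession
      import org.apache.spark.sql.types._

      // Hypothetical URIs and collection names, for illustration only.
      val spark = SparkSession.builder()
        .appName("copy-collection")
        .config("spark.mongodb.input.uri", "mongodb://source-host/db.listings")
        .config("spark.mongodb.output.uri", "mongodb://target-cluster/db.listings")
        // A larger sample makes inference more likely to see rare document
        // shapes, but it is still sampling and gives no hard guarantee.
        .config("spark.mongodb.input.sampleSize", "100000")
        .getOrCreate()

      // Explicit schema: fields missing from the sampled documents are typed
      // here instead of being inferred as NullType. "fees" is a guessed name
      // for the array shown in the error message.
      val feeSchema = new StructType()
        .add("fees", ArrayType(new StructType()
          .add("type", StringType)
          .add("appliesPer", StringType)
          .add("description", StringType)
          .add("minAmount", IntegerType)
          .add("maxAmount", IntegerType)))

      val df = spark.read
        .format("mongo")       // short name registered by connector 2.2+
        .schema(feeSchema)     // bypasses sampling-based inference
        .load()

      df.write.format("mongo").mode("append").save()

      With an explicit schema the read no longer depends on which documents the sampler happens to pick, which is the usual answer to this class of MongoTypeConversionException.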


            Assignee:
            ross@mongodb.com Ross Lawley
            Reporter:
            srinu.gajjala321 srinivas rao gajjala
            Votes:
            0
            Watchers:
            2
