Spark MongoDB connector unable to infer schema correctly


    • Type: Task
    • Resolution: Works as Designed
    • Priority: Major - P3
    • Affects Version/s: None
    • Component/s: Schema

      Hi,

      I'm moving data from one collection to another collection in a different cluster using Spark. The data's schema is not consistent (a single collection contains several schema variations). When I try to read the data with Spark, schema sampling fails to capture all of the variants and throws the error below.

       

      com.mongodb.spark.exceptions.MongoTypeConversionException: Cannot cast ARRAY into a NullType (value: BsonArray{values=[
      { "type" : "GUEST_FEE", "appliesPer" : "GUEST_PER_NIGHT", "description" : null, "minAmount" : 33, "maxAmount" : 33 }
      ]})
      
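      The failure mode can be illustrated with a toy sketch (illustrative only, not the connector's actual internals; the `infer_field_type` helper and the `fees` field name are hypothetical): if a field is null in every sampled document, inference settles on NullType, and an unsampled document carrying an array in that field can no longer be cast.

```python
# Toy sketch of sampling-based schema inference (NOT connector code):
# a field that is null in every sampled document is inferred as NullType,
# so a later array value cannot be cast to the inferred type.
def infer_field_type(sampled_values):
    """Toy per-field type inference over a sample of values."""
    seen = {type(v).__name__ for v in sampled_values if v is not None}
    if not seen:
        return "NullType"      # every sampled value was null
    if len(seen) == 1:
        return seen.pop()      # e.g. "list" for a BSON array
    return "StringType"        # simplified conflict fallback

# Suppose every sampled document happens to have a null "fees" field...
sample = [{"fees": None}, {"fees": None}, {"fees": None}]
print(infer_field_type(d["fees"] for d in sample))  # NullType

# ...then a document outside the sample holds an array value, which
# cannot be cast to the inferred NullType -> the reported exception.
outlier = {"fees": [{"type": "GUEST_FEE", "minAmount": 33, "maxAmount": 33}]}
```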

       

      Is it a known issue or any solution for this?
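      Two workarounds are commonly suggested for this situation; the sketch below assumes PySpark and has not been verified against a live cluster. The format string and option names vary by connector version ("mongodb" with `connection.uri` in connector 10.x; "mongo"/`com.mongodb.spark.sql.DefaultSource` with `spark.mongodb.input.uri` in 2.x/3.x), and the URI and the `fees` field name are placeholders.

```python
from pyspark.sql import SparkSession
from pyspark.sql.types import (StructType, StructField, StringType,
                               IntegerType, ArrayType)

spark = SparkSession.builder.getOrCreate()

# Workaround 1: sample more documents so rare shapes are seen by inference.
df = (spark.read.format("mongodb")
      .option("connection.uri", "mongodb://host:27017/db.coll")  # placeholder
      .option("sampleSize", 100000)  # raise from the default
      .load())

# Workaround 2: declare the schema explicitly and skip inference entirely.
fee = StructType([
    StructField("type", StringType()),
    StructField("appliesPer", StringType()),
    StructField("description", StringType()),
    StructField("minAmount", IntegerType()),
    StructField("maxAmount", IntegerType()),
])
explicit = StructType([StructField("fees", ArrayType(fee))])  # field name assumed
df2 = (spark.read.format("mongodb")
       .option("connection.uri", "mongodb://host:27017/db.coll")  # placeholder
       .schema(explicit)
       .load())
```

      With an explicit schema, documents that deviate from it are coerced or nulled out rather than failing inference, which is usually the safer choice for collections with mixed shapes.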

       

      Thanks,

      Srini


            Assignee:
            Ross Lawley
            Reporter:
            srinivas rao gajjala
            Votes:
            0
            Watchers:
            2

              Created:
              Updated:
              Resolved: