Spark Connector / SPARK-142

0.000000 is not saved correctly as NumberDecimal: Decimal scale (12) cannot be greater than precision (1)

    • Type: Bug
    • Resolution: Fixed
    • Priority: Major - P3
    • Fix Version/s: 2.1.2, 2.2.3
    • Affects Version/s: 2.2.0
    • Component/s: Schema
    • Environment:
      Spark 2.2
      MongoDB 3.4

      I use Spark SQL JDBC to load data from SQL Server that includes the value 0.000000 [decimal(28,12)], and then save the DataFrame into MongoDB. I find that

      {"Position" : NumberDecimal("0E-12")}

      is saved in MongoDB. When I load this data from MongoDB back into a DataFrame and call show, the exception "Decimal scale (12) cannot be greater than precision (1)" is thrown. If I manually update the document to

      {"Position" : NumberDecimal("0")}

      it works fine. Is this a bug? Could you tell me how to fix it?
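
      The failure can be reproduced outside Spark with plain java.math.BigDecimal. This is a minimal sketch (the normalizeZero helper is hypothetical, not part of the connector): "0E-12" is the canonical rendering of 0.000000000000, whose unscaled value is 0. The JDK defines the precision of a zero BigDecimal as 1 while the scale stays 12, so scale (12) exceeds precision (1), which is the invariant Spark's DecimalType enforces. Rewriting zero with scale 0 avoids the mismatch:

      ```java
      import java.math.BigDecimal;

      public class ZeroDecimalDemo {
          // Hypothetical workaround: collapse any zero to plain 0 (scale 0)
          // so that scale never exceeds precision.
          static BigDecimal normalizeZero(BigDecimal value) {
              return value.compareTo(BigDecimal.ZERO) == 0 ? BigDecimal.ZERO : value;
          }

          public static void main(String[] args) {
              // The value the connector wrote to MongoDB.
              BigDecimal stored = new BigDecimal("0E-12");

              // Precision of a zero BigDecimal is 1; the scale remains 12.
              System.out.println("precision = " + stored.precision()); // 1
              System.out.println("scale     = " + stored.scale());     // 12

              // Normalized form satisfies scale <= precision again.
              BigDecimal fixed = normalizeZero(stored);
              System.out.println("fixed     = " + fixed);              // 0
          }
      }
      ```

      Comparing with compareTo rather than equals is deliberate: BigDecimal.equals distinguishes 0 from 0E-12, while compareTo treats all zeros as equal.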

            Assignee:
            ross@mongodb.com Ross Lawley
            Reporter:
            windshjw windshjw
            Votes:
            1
            Watchers:
            7
