SPARK-432: Broken read filter on struct fields with a dash character

    • Type: Bug
    • Resolution: Fixed
    • Priority: Major - P3
    • Fix Version/s: 10.4.0
    • Affects Version/s: 10.3.0
    • Component/s: Reads
    • Java Drivers
    • Documentation Changes: Not Needed

      1. What would you like to communicate to the user about this feature?
      2. Would you like the user to see examples of the syntax and/or executable code and its output?
      3. Which versions of the driver/connector does this apply to?


      When a filter is applied to a sub-column whose name contains a dash, the filter wrongly returns no rows, regardless of the condition.

      A similar problem existed before 10.2.0 with any field containing a dash. It was partially fixed in 10.2.0 but still occurs for fields that are not at the root level:

      dataset.filter(col("some-field").lt(100)); 
      Successdataset.filter(col("main.someField").lt(100)); 
      Successdataset.filter(col("main.some-field").lt(100)); // Fails silently (returns an empty dataset even if they are matching 
      rowsdataset.cache().filter(col("main.some-field").lt(100)); // Success, since we do the filtering in spark rather than in the connector
      

            Assignee:
            Ross Lawley (ross@mongodb.com)
            Reporter:
            Cedric van Eetvelde (cedric.vaneetvelde@soprabanking.com)
            Votes:
            0
            Watchers:
            3
