- Type: Bug
- Resolution: Fixed
- Priority: Unknown
- Affects Version/s: 10.0.1
- Component/s: None
What did I do
df = spark.sql("SELECT CAST('2022-05-01' AS DATE)")

(
    df.write
    .format("mongodb")
    .mode("append")
    .save()
)
What do I expect
The DataFrame is written to MongoDB.
What did I see
Caused by: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 8.0 failed 4 times, most recent failure: Lost task 0.3 in stage 8.0 (TID 28) (10.221.238.71 executor 1): com.mongodb.spark.sql.connector.exceptions.DataException: Cannot cast [2022-05-01] into a BsonValue. StructType(StructField(CAST(2022-05-01 AS DATE),DateType,false)) has no matching BsonValue. Error: Cannot cast 2022-05-01 into a BsonValue. DateType has no matching BsonValue. Error: java.sql.Date cannot be cast to java.sql.Timestamp
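The final `ClassCastException` can be reproduced with plain Java, independent of the connector: `java.sql.Date` and `java.sql.Timestamp` both extend `java.util.Date`, but neither is a subclass of the other, so a direct cast fails at runtime. The snippet below is only an illustration of that cast, not connector code.

```java
import java.sql.Date;
import java.sql.Timestamp;

public class DateCastDemo {
    public static void main(String[] args) {
        // Spark hands DateType column values to the connector as java.sql.Date.
        Object value = Date.valueOf("2022-05-01");

        // java.sql.Date is not a java.sql.Timestamp, so this cast throws
        // "java.sql.Date cannot be cast to java.sql.Timestamp".
        Timestamp ts = (Timestamp) value;
        System.out.println(ts);
    }
}
```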
What did I find
I found that the connector tries to cast the date value, which is a `java.sql.Date`, into a `java.sql.Timestamp`, which causes the error.
I have created a PR on GitHub to fix the issue.
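The actual change is in the linked PR; as a rough illustration of the direction, a `DateType` value can be mapped to a `BsonDateTime` through its epoch-millis value instead of being cast to a `Timestamp`. The class and method names below (`DateToBsonSketch`, `toBsonDateTime`) are hypothetical and do not reflect the connector's internals.

```java
import java.sql.Date;
import java.sql.Timestamp;
import org.bson.BsonDateTime;
import org.bson.BsonValue;

public final class DateToBsonSketch {

    // Convert a temporal value to a BsonDateTime via its epoch-millis
    // representation, avoiding any cast between the two java.sql types.
    static BsonValue toBsonDateTime(Object value) {
        if (value instanceof Timestamp) {
            return new BsonDateTime(((Timestamp) value).getTime());
        }
        if (value instanceof Date) {
            // java.sql.Date also exposes epoch millis via getTime().
            return new BsonDateTime(((Date) value).getTime());
        }
        throw new IllegalArgumentException("Unsupported temporal value: " + value);
    }

    public static void main(String[] args) {
        System.out.println(toBsonDateTime(Date.valueOf("2022-05-01")));
    }
}
```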