The Spark Connector v10.0.2 supports Spark servers up to v3.2.4. More recent Spark server versions (such as 3.4.1 and 3.5.0) introduce breaking changes that require corresponding support in the Spark Connector.
[A community user reported this error when using Spark 3.4.1 with Spark Connector 10.2](https://www.mongodb.com/community/forums/t/getting-java-lang-nosuchmethoderror-when-trying-to-write-to-a-collection-via-spark-mongo-connector/237021/2)
The error message indicates that the Spark job is failing due to issues in the `InternalRowToRowFunction` and `RowToBsonDocumentConverter` classes from the MongoDB Spark Connector.
```
23/07/27 13:51:03 INFO DAGScheduler: ResultStage 1 (save at Main.java:25) failed in 12.753 s due to Job aborted due to stage failure: Task 0 in stage 1.0 failed 4 times, most recent failure: Lost task 0.3 in stage 1.0 (TID 4) (10.0.7.3 executor 1): java.lang.NoSuchMethodError: 'scala.collection.immutable.Seq org.apache.spark.sql.types.StructType.toAttributes()' at com.mongodb.spark.sql.connector.schema.InternalRowToRowFunction.<init>(InternalRowToRowFunction.java:46)
```
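The `NoSuchMethodError` is consistent with a binary incompatibility: the connector was compiled against a `StructType.toAttributes()` method that is no longer present in the Spark 3.4+ runtime. Until a connector release supports those Spark versions, one workaround is to pin the job to a Spark runtime the connector does support. A minimal Maven sketch, assuming a Scala 2.12 build and the versions stated above (verify against the connector's published compatibility matrix before relying on it):

```xml
<!-- Sketch: pin Spark 3.2.4 (the latest server version the note above
     says Spark Connector v10.0.2 supports) alongside that connector. -->
<dependencies>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.12</artifactId>
    <version>3.2.4</version>
    <!-- provided: the cluster supplies Spark at runtime -->
    <scope>provided</scope>
  </dependency>
  <dependency>
    <groupId>org.mongodb.spark</groupId>
    <artifactId>mongo-spark-connector_2.12</artifactId>
    <version>10.0.2</version>
  </dependency>
</dependencies>
```

The `provided` scope matters here: the connector must link against the same Spark version the cluster actually runs, so the cluster's Spark deployment needs to be 3.2.x as well, not just the build-time dependency.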