  Spark Connector / SPARK-250

With the MongoDB connector, Spark gets stuck at the last task.

    • Type: Task
    • Resolution: Done
    • Priority: Major - P3
    • Affects Version/s: 2.3.2
    • Component/s: Reads
    • Environment:
      Linux, MongoDB 3.x, Spark 2.3.1, Scala 2.11.11

      // Load a MongoRDD with an aggregation pipeline, then count it.
      // (requires: import com.mongodb.spark._)
      val rdd = Global.sparkContext.loadFromMongoDB(...).withPipeline(...)
      rdd.count()

      It always gets stuck at the last task. It may take 30 minutes to finish this last task, or it may hang forever. No exception or error is found.

       

      If it reads only a few records, for example 2,000, it finishes the last task quickly. If it reads more than 100,000 records, it hangs there.
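
      For reference, below is a minimal, self-contained reproduction sketch along the lines of the snippet above, assuming the Scala API of the MongoDB Spark connector 2.3.x. The URI, collection name, and $match pipeline are placeholders, since the real ones are elided in this report.

      import org.apache.spark.{SparkConf, SparkContext}
      import com.mongodb.spark._
      import com.mongodb.spark.config.ReadConfig
      import org.bson.Document

      object CountRepro {
        def main(args: Array[String]): Unit = {
          // Hypothetical URI and collection; the real ones are not in the report.
          val sc = new SparkContext(new SparkConf()
            .setAppName("mongo-count-repro")
            .set("spark.mongodb.input.uri", "mongodb://127.0.0.1/test.events"))

          val readConfig = ReadConfig(sc) // picks up spark.mongodb.input.uri
          val rdd = sc.loadFromMongoDB(readConfig)
            .withPipeline(Seq(Document.parse("""{ "$match": { "status": "A" } }""")))

          // Reportedly hangs on the last task once the result exceeds ~100,000 records.
          println(rdd.count())
          sc.stop()
        }
      }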

      I have set System.setProperty("spark.mongodb.keep_alive_ms", "1000000"), but it does not help.
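
      One possible diagnostic (a sketch, assuming `rdd` is the MongoRDD from the snippet above): check whether the connector produced a heavily skewed partitioning, since a single oversized partition would explain one long-running final task.

      // Count documents per partition; this consumes each partition's iterator,
      // so run it as a separate job rather than before the real count.
      val partitionSizes = rdd
        .mapPartitionsWithIndex((idx, iter) => Iterator((idx, iter.size)))
        .collect()

      partitionSizes.sortBy(_._2).foreach { case (idx, n) =>
        println(s"partition $idx -> $n documents")
      }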

       

            Assignee:
            Ross Lawley (ross@mongodb.com)
            Reporter:
            Feng Zhang (efenzha)
            Votes:
            0
            Watchers:
            3
