Spark Connector / SPARK-45

Reuse sqlContext when creating new dataframes

    • Type: Improvement
    • Resolution: Done
    • Priority: Major - P3
    • Fix Version/s: 0.3
    • Affects Version/s: None
    • Component/s: None
    • Labels: None

      Currently, sqlContext.read.option("uri", inputUri).mongo() passes the Spark context to MongoRDD and creates a new SQLContext.

      The existing SQLContext should be reused instead.
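
      The change described above can be sketched as follows. This is a minimal, hypothetical illustration (the object and method names below are assumptions, not the connector's actual internals), contrasting the wasteful shape, where only the SparkContext is kept and a second SQLContext must be constructed, with the proposed shape, where the caller's SQLContext is threaded through:

      ```scala
      import org.apache.spark.sql.{DataFrame, SQLContext}

      // Hypothetical helper names for illustration only.
      object MongoLoader {

        // Problematic shape: only sqlContext.sparkContext is used, so a
        // brand-new SQLContext has to be built to produce the DataFrame.
        def loadWithNewContext(sqlContext: SQLContext, inputUri: String): DataFrame = {
          val fresh = new SQLContext(sqlContext.sparkContext) // extra context per call
          fresh.read.format("com.mongodb.spark.sql").option("uri", inputUri).load()
        }

        // Proposed shape: keep and reuse the SQLContext that initiated the read.
        def loadReusingContext(sqlContext: SQLContext, inputUri: String): DataFrame =
          sqlContext.read.format("com.mongodb.spark.sql").option("uri", inputUri).load()
      }
      ```

      Reusing the caller's SQLContext avoids the cost of instantiating a second context and keeps any configuration or registered temporary tables on that context visible to the returned DataFrame.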

            Assignee:
            Unassigned
            Reporter:
            Ross Lawley (ross@mongodb.com)
            Votes:
            0
            Watchers:
            2
