Spark Connector / SPARK-56

Make MongoSpark part of the Scala API

    • Type: Improvement
    • Resolution: Done
    • Priority: Major - P3
    • Fix Version/s: 0.3
    • Affects Version/s: None
    • Component/s: API
    • Labels: None

      In Spark 2.0, SparkSession is going to become the main entry point for Spark [1]. In that context it no longer makes sense for MongoRDD to be the main entry point for configuring / customising the connector. Instead, the plan is to create a MongoSpark case class and companion object to fulfil this role, as sketched below. This will simplify any future upgrades of the connector.
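
      To make the idea concrete, here is a minimal sketch of the proposed shape, assuming a case class that wraps the SparkSession together with its resolved read configuration. The names, the `spark.mongodb.input` key prefix, and the `load` signature are illustrative assumptions, not the connector's final API:

```scala
import org.apache.spark.sql.SparkSession

// Hypothetical sketch: a case class holding the SparkSession plus the
// resolved read configuration, so all connector state lives in one value.
case class MongoSpark(sparkSession: SparkSession, readConfig: Map[String, String]) {
  // The real connector would build a MongoRDD from this state; returning
  // the resolved URI keeps this sketch self-contained and runnable.
  def connectionString: String =
    readConfig.getOrElse("spark.mongodb.input.uri", "mongodb://localhost/test.coll")
}

object MongoSpark {
  // The companion object becomes the single entry point, mirroring the
  // builder pattern SparkSession itself uses in Spark 2.0.
  def load(sparkSession: SparkSession): MongoSpark = {
    val mongoConf = sparkSession.conf.getAll.filter {
      case (key, _) => key.startsWith("spark.mongodb.input")
    }
    MongoSpark(sparkSession, mongoConf)
  }
}
```

      Under this shape a caller would obtain everything via `MongoSpark.load(sparkSession)` rather than constructing a MongoRDD directly, so any future change to Spark's entry point stays contained in the companion object.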

      As an added bonus, this will allow for the removal of most of the `java.api` package; only the Java Bean FieldTypes will still be needed for Java users.

      [1] http://blog.madhukaraphatak.com/introduction-to-spark-two-part-1/

            Assignee: Ross Lawley (ross@mongodb.com)
            Reporter: Ross Lawley (ross@mongodb.com)
