Spark Connector / SPARK-377

Add the ability to specify a comment in the spark.option

    • Type: New Feature
    • Resolution: Fixed
    • Priority: Major - P3
    • Fix Version/s: 10.2.0
    • Affects Version/s: None
    • Component/s: None
    • Documentation Changes: Needed

      1. What would you like to communicate to the user about this feature?
      New comment configuration for read/write commands.
      2. Would you like the user to see examples of the syntax and/or executable code and its output?
      Comments appear in the database profiler output. See: https://www.mongodb.com/docs/manual/reference/database-profiler/#mongodb-data-system.profile.command. A usage sketch follows this summary.
      3. Which versions of the driver/connector does this apply to?
      10.2.0

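      A minimal usage sketch of the new option on both the read and write paths, assuming the option key is "comment" (as in the snippet in the description below); the connector package version, connection URI, database, collection, and comment string are placeholders:

      from pyspark.sql import SparkSession

      # Placeholder session setup; adjust the package version, URI, database, and collection.
      spark = (
          SparkSession.builder
          .appName("mongodb-comment-option")
          .config("spark.jars.packages", "org.mongodb.spark:mongo-spark-connector_2.12:10.2.0")
          .config("spark.mongodb.read.connection.uri", "mongodb://localhost:27017")
          .config("spark.mongodb.write.connection.uri", "mongodb://localhost:27017")
          .getOrCreate()
      )

      # Read with a comment so the resulting commands can be isolated in the profiler.
      df = (
          spark.read.format("mongodb")
          .option("database", "test")
          .option("collection", "coll")
          .option("comment", "spark nightly report")
          .load()
      )

      # The ticket says the comment configuration covers read and write commands,
      # so the same option is assumed to apply on the write path.
      (
          df.write.format("mongodb")
          .mode("append")
          .option("database", "test")
          .option("collection", "coll_copy")
          .option("comment", "spark nightly report")
          .save()
      )

      spark.stop()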

      When profiling queries, it would be easier to isolate queries coming from the Spark Connector if users could add something like:

      spark.read.format("mongodb").option("comment", "query comment").load()

      This would be similar to https://www.mongodb.com/docs/manual/reference/operator/query/comment/#mongodb-query-op.-comment, where the profiler would show the $comment.
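
      To illustrate the profiling workflow described above, here is a minimal sketch (using PyMongo) of isolating the connector's operations by their comment. It assumes profiling is enabled on the target database and that the comment surfaces under command.comment in the system.profile documents, per the database-profiler docs linked in the summary; the connection string, database name, and comment string are placeholders.

      from pymongo import MongoClient

      client = MongoClient("mongodb://localhost:27017")  # placeholder connection string
      db = client["test"]

      # Profiling must be on for operations to be recorded in system.profile.
      db.command("profile", 2)

      # Isolate profiled operations tagged with the Spark job's comment.
      for op in db["system.profile"].find({"command.comment": "spark nightly report"}):
          print(op["op"], op["ns"], op.get("millis"))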

            Assignee: Ross Lawley (ross@mongodb.com)
            Reporter: Robert Walters (Inactive) (robert.walters@mongodb.com)
            Votes: 0
            Watchers: 5
