Allow a configurable limit for the number of consecutive duplicates in $sample from random cursor

    • Type: Improvement
    • Resolution: Unresolved
    • Priority: Major - P3
    • Affects Version/s: None
    • Component/s: None
    • Query Optimization

      Currently, the $sample aggregation stage, when backed by a random cursor, fails after seeing 100 consecutive duplicate documents (a hard-coded limit). There have been cases where making this limit configurable could have helped mitigate issues with mongosync.
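      Below is a minimal, self-contained sketch (not the actual server implementation) of the behavior described above: the stage repeatedly draws random documents, counts consecutive duplicates, and errors out once the counter reaches the limit. The knob name sampleFromRandomCursorMaxDuplicates and all other identifiers are hypothetical, used only to illustrate replacing the hard-coded 100 with a tunable value.

{code:cpp}
// Illustrative sketch only; not MongoDB source code.
#include <cstdint>
#include <iostream>
#include <random>
#include <set>
#include <stdexcept>
#include <string>

// Hypothetical knob: today the equivalent value is a hard-coded constant (100);
// the request is to make it tunable (e.g. via a server parameter).
int64_t sampleFromRandomCursorMaxDuplicates = 100;

// Simulates drawing one document _id from a "random cursor".
int64_t drawRandomId(std::mt19937& gen, int64_t collectionSize) {
    std::uniform_int_distribution<int64_t> dist(0, collectionSize - 1);
    return dist(gen);
}

// Collects `sampleSize` distinct _ids, failing if too many consecutive
// duplicates are drawn -- the same failure mode the ticket describes.
std::set<int64_t> sampleDistinctIds(int64_t sampleSize, int64_t collectionSize) {
    std::mt19937 gen{std::random_device{}()};
    std::set<int64_t> seen;
    int64_t consecutiveDuplicates = 0;
    while (static_cast<int64_t>(seen.size()) < sampleSize) {
        int64_t id = drawRandomId(gen, collectionSize);
        if (seen.insert(id).second) {
            consecutiveDuplicates = 0;  // made progress: reset the counter
        } else if (++consecutiveDuplicates >= sampleFromRandomCursorMaxDuplicates) {
            throw std::runtime_error(
                "could not find a non-duplicate document after " +
                std::to_string(sampleFromRandomCursorMaxDuplicates) +
                " consecutive attempts");
        }
    }
    return seen;
}

int main() {
    // With a small collection and a large sample, duplicates are frequent;
    // raising the limit lets the sample complete where 100 might not.
    sampleFromRandomCursorMaxDuplicates = 10000;
    std::set<int64_t> ids = sampleDistinctIds(/*sampleSize=*/90, /*collectionSize=*/100);
    std::cout << "sampled " << ids.size() << " distinct ids\n";
    return 0;
}
{code}

      In the real server the limit would presumably be exposed as a startup or runtime setParameter rather than a global variable; the sketch only demonstrates the counter-versus-limit logic that the configurable value would feed into.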

            Assignee: Unassigned
            Reporter: Nicholas Zolnierz
            Votes: 0
            Watchers: 4
            Created:
            Updated: