Type: Improvement
Resolution: Won't Fix
Priority: Major - P3
Affects Version/s: 2.6.11
Component/s: Aggregation Framework
Fully Compatible
Query 10 (02/22/16), Query 11 (03/14/16), Query 12 (04/04/16)
I wonder whether it would be possible to add a coalescing optimization for a sequence of $group and $limit stages in the aggregation pipeline.
For example, I have a pipeline like this:

db.messages.aggregate([
    {$match: {"company_id": ObjectId("4c2118ad54397f271b000000")}},
    {$sort: {"ct": -1}},
    {$group: {_id: "$ti", ts: {$first: "$ct"}}},
    {$limit: 10}
])
My goal is to get a small subset of $ti ("thread id") values that have the latest $ct ("conversation timestamp"). Execution time for this request is prohibitively high, and I suspect that is because the $group stage processes all input documents and the limit is applied only afterwards.

It seems that incorporating the $limit into $group processing would benefit this query by avoiding computations whose results are thrown away in any case.
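The proposed optimization can be sketched in plain JavaScript over an in-memory array (this is an illustration only, not MongoDB internals; groupFirstLimited is a hypothetical helper, and the field names follow the pipeline above). Because the input is already sorted by ct descending, the first occurrence of each ti carries its latest ct, so a limit-aware $group could stop scanning once it has seen the requested number of distinct groups:

```javascript
// Full grouping: touches every document, then discards all but `limit` groups.
// This mirrors the current $group + $limit behavior described above.
function groupFirstThenLimit(docs, limit) {
  const groups = new Map();
  for (const d of docs) {
    if (!groups.has(d.ti)) groups.set(d.ti, d.ct); // $first on sorted input
  }
  return [...groups].slice(0, limit).map(([id, ts]) => ({ _id: id, ts }));
}

// Coalesced version (hypothetical): scanning stops as soon as `limit`
// distinct groups have been seen, skipping the rest of the input.
function groupFirstLimited(docs, limit) {
  const groups = new Map();
  let scanned = 0;
  for (const d of docs) {
    scanned++;
    if (!groups.has(d.ti)) {
      groups.set(d.ti, d.ct);
      if (groups.size === limit) break; // early exit
    }
  }
  return { scanned, result: [...groups].map(([id, ts]) => ({ _id: id, ts })) };
}
```

Note that the early exit is only correct because the preceding $sort guarantees each group's $first value appears before any of its other documents; without that ordering, the full scan is unavoidable.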
related to: SERVER-4507 aggregation: optimize $group to take advantage of sorted sequences (status: Backlog)