Core Server / SERVER-58131

Push down $lookup through $_internalUnpackBucket

    • Type: Improvement
    • Resolution: Unresolved
    • Priority: Major - P3
    • Affects Version/s: None
    • Component/s: None
    • Query Integration

      With time-series data, you might want to store some metadata in a separate collection. For example, if each event is associated with a site (a location), you could store just the site name in each event, and keep the site details in a separate collection:

      > db.events.find()
      { ... meta: { site: 'A' }, ... }
      { ... meta: { site: 'B' }, ... }
      ...
      
      > db.sites.find()
      { _id: 'A', ... }
      { _id: 'B', ... }
      

      You would query it by doing a $lookup to pull the details into each event:

      db.events.aggregate([
        {$lookup: {
          from: 'sites', 
          localField: 'meta.site', 
          foreignField: '_id', 
          as: 'meta.site_details',
        }}
      ])
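Note that $lookup's "as" field is always an array of matched documents, even when foreignField is a unique key like _id, so each event would come back shaped roughly like this (any site fields beyond _id are assumptions, elided here):

```javascript
// Hypothetical shape of one event after the $lookup above.
// With a unique _id on the sites collection, the 'site_details'
// array holds at most one element.
const event = {
  meta: {
    site: 'A',
    site_details: [{ _id: 'A' }], // other site fields elided
  },
};
```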
      

      Since this $lookup only reads and writes metadata fields, we should be able to execute it once per bucket, before unpacking:

      db.system.buckets.events.aggregate([
        {$lookup: {
          from: 'sites', 
          localField: 'meta.site', 
          foreignField: '_id', 
          as: 'meta.site_details',
        }},
        {$_internalUnpackBucket: {metaField: 'meta'}},
      ])
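The rewrite is possible because a bucket stores a single meta value shared by every measurement in it, so running the lookup once per bucket and copying the enriched meta to each unpacked event gives the same answer as looking up per event. A minimal in-memory sketch of that equivalence (plain JavaScript with an assumed, simplified bucket layout — not server code):

```javascript
// A toy 'sites' collection and one simplified bucket: all measurements
// in the bucket share the single top-level 'meta' value.
const sites = { A: { name: 'Alpha' }, B: { name: 'Beta' } };

const bucket = {
  _id: 1,
  meta: { site: 'A' },
  data: { time: { 0: 't0', 1: 't1' }, temp: { 0: 20, 1: 21 } },
};

// Unpack a bucket into individual events (simplified).
function unpack(b) {
  return Object.keys(b.data.time).map((i) => ({
    time: b.data.time[i],
    temp: b.data.temp[i],
    meta: { ...b.meta },
  }));
}

// Lookup after unpacking: one lookup per event.
const perEvent = unpack(bucket).map((e) => ({
  ...e,
  meta: { ...e.meta, site_details: [sites[e.meta.site]] },
}));

// Lookup pushed down before unpacking: one lookup per bucket.
const pushedDown = unpack({
  ...bucket,
  meta: { ...bucket.meta, site_details: [sites[bucket.meta.site]] },
});

// Both orders produce identical events.
```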
      

            Assignee:
            Backlog - Query Integration
            Reporter:
            David Percy (david.percy@mongodb.com)