- Type: Bug
- Resolution: Fixed
- Priority: Major - P3
- Affects Version/s: None
- Component/s: Aggregation Framework
- Query Execution
- Fully Compatible
- ALL
Each document is limited to 16MB, but because a change event may be required to carry both the post-image of the updated document and the update description, the event can easily exceed 16MB.

For example, this can be reproduced as follows:
1. Insert a document containing a 10-million-character string.
2. Update the document to contain a different 10-million-character string.
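The arithmetic behind the reproduction can be sketched as follows. This is a plain size estimate, not a live reproduction: it assumes each character occupies one byte in the BSON string and ignores the small per-field BSON overhead, and simply shows that carrying the large string twice (once in the post-image, once in the update description) already exceeds the 16MB BSON document limit.

```python
# BSON document size limit: 16MB.
BSON_MAX_BYTES = 16 * 1024 * 1024

# The reproduction uses a 10-million-character string (assumed
# one byte per character in BSON).
field_bytes = 10_000_000

# An update event can carry the string twice: once in the
# post-image (fullDocument) and once in the updateDescription.
approx_event_bytes = 2 * field_bytes

# ~20MB exceeds the 16MB limit, so building the event fails.
print(approx_event_bytes > BSON_MAX_BYTES)  # True
```

The single insert (step 1) stays under the limit at roughly 10MB; it is the update event, which duplicates the payload, that crosses 16MB.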
- is duplicated by
- SERVER-64592 Make sure events with large documentKey do not crash change stream (Closed)
- is related to
- SERVER-81295 Cannot resume V2 changeStream pipelines with V1 resume tokens (Closed)
- SERVER-67699 Add tracking for when change stream event exceeds 16Mb (Closed)
- KAFKA-247 Recreate change stream from the point of failure for event > 16 MB (Closed)
- related to
- SERVER-53387 Large internal metadata can trigger BSONObjectTooLarge for commands under the BSON size limit (Backlog)
- KAFKA-381 Support change stream split large events (Needs Triage)