[2020-07-14 19:13:51,720] INFO An exception occurred when trying to get the next item from the changestream. (com.mongodb.kafka.connect.source.MongoSourceTask)
com.mongodb.MongoCommandException: Command failed with error 10334 (Location10334): 'BSONObj size: 20889370 (0x13EBF1A) is invalid. Size must be between 0 and 16793600(16MB) First element: _id: { _data: BinData(0, 825F0CB26D00000002461E5F6964002B1E005A1004A99885442A0A41A3B400871ACF7D24B804) }' on server mongo1:27017. The full response is {"operationTime": {"$timestamp": {"t": 1594667630, "i": 2}}, "ok": 0.0, "errmsg": "BSONObj size: 20889370 (0x13EBF1A) is invalid. Size must be between 0 and 16793600(16MB) First element: _id: { _data: BinData(0, 825F0CB26D00000002461E5F6964002B1E005A1004A99885442A0A41A3B400871ACF7D24B804) }", "code": 10334, "codeName": "Location10334", "$clusterTime": {"clusterTime": {"$timestamp": {"t": 1594667630, "i": 2}}, "signature": {"hash": {"$binary": "AAAAAAAAAAAAAAAAAAAAAAAAAAA=", "$type": "00"}, "keyId": {"$numberLong": "0"}}}}
    at com.mongodb.internal.connection.ProtocolHelper.getCommandFailureException(ProtocolHelper.java:175)
    at com.mongodb.internal.connection.InternalStreamConnection.receiveCommandMessageResponse(InternalStreamConnection.java:303)
    at com.mongodb.internal.connection.InternalStreamConnection.sendAndReceive(InternalStreamConnection.java:259)
    at com.mongodb.internal.connection.UsageTrackingInternalConnection.sendAndReceive(UsageTrackingInternalConnection.java:99)
    at com.mongodb.internal.connection.DefaultConnectionPool$PooledConnection.sendAndReceive(DefaultConnectionPool.java:450)
    at com.mongodb.internal.connection.CommandProtocolImpl.execute(CommandProtocolImpl.java:72)
    at com.mongodb.internal.connection.DefaultServer$DefaultServerProtocolExecutor.execute(DefaultServer.java:226)
    at com.mongodb.internal.connection.DefaultServerConnection.executeProtocol(DefaultServerConnection.java:269)
    at com.mongodb.internal.connection.DefaultServerConnection.command(DefaultServerConnection.java:131)
    at com.mongodb.internal.connection.DefaultServerConnection.command(DefaultServerConnection.java:123)
    at com.mongodb.operation.CommandOperationHelper.executeCommand(CommandOperationHelper.java:343)
    at com.mongodb.operation.CommandOperationHelper.executeCommand(CommandOperationHelper.java:334)
    at com.mongodb.operation.CommandOperationHelper.executeCommandWithConnection(CommandOperationHelper.java:220)
    at com.mongodb.operation.CommandOperationHelper$5.call(CommandOperationHelper.java:206)
    at com.mongodb.operation.OperationHelper.withReadConnectionSource(OperationHelper.java:463)
    at com.mongodb.operation.CommandOperationHelper.executeCommand(CommandOperationHelper.java:203)
    at com.mongodb.operation.AggregateOperationImpl.execute(AggregateOperationImpl.java:200)
    at com.mongodb.operation.ChangeStreamOperation$1.call(ChangeStreamOperation.java:340)
    at com.mongodb.operation.ChangeStreamOperation$1.call(ChangeStreamOperation.java:336)
    at com.mongodb.operation.OperationHelper.withReadConnectionSource(OperationHelper.java:463)
    at com.mongodb.operation.ChangeStreamOperation.execute(ChangeStreamOperation.java:336)
    at com.mongodb.operation.ChangeStreamBatchCursor.resumeableOperation(ChangeStreamBatchCursor.java:181)
    at com.mongodb.operation.ChangeStreamBatchCursor.tryNext(ChangeStreamBatchCursor.java:83)
    at com.mongodb.client.internal.MongoChangeStreamCursorImpl.tryNext(MongoChangeStreamCursorImpl.java:78)
    at com.mongodb.kafka.connect.source.MongoSourceTask.getNextDocument(MongoSourceTask.java:338)
    at com.mongodb.kafka.connect.source.MongoSourceTask.poll(MongoSourceTask.java:155)
    at org.apache.kafka.connect.runtime.WorkerSourceTask.poll(WorkerSourceTask.java:270)
    at org.apache.kafka.connect.runtime.WorkerSourceTask.execute(WorkerSourceTask.java:237)
    at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:184)
    at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:234)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
[2020-07-14 19:13:51,728] DEBUG Poll await time passed before reaching max batch size returning 2 records (com.mongodb.kafka.connect.source.MongoSourceTask)
[2020-07-14 19:13:51,739] INFO Committing files after waiting for rotateIntervalMs time but less than flush.size records available. (io.confluent.connect.s3.TopicPartitionWriter)
[2020-07-14 19:13:52,091] INFO Files committed to S3. Target commit offset for mongo.test.investigate1-0 is 2 (io.confluent.connect.s3.TopicPartitionWriter)
[2020-07-14 19:13:52,449] ERROR Record id: {_id=15} for organization id: in: {db=test, coll=investigate1} is 13134271 which is larger than supported size: 1048576 (com.everbridge.analytics.connector.transform.KeyReplacer)
[2020-07-14 19:13:52,450] DEBUG Polling Start: 1594667632450 (com.mongodb.kafka.connect.source.MongoSourceTask)
[2020-07-14 19:13:52,450] DEBUG Creating a MongoCursor (com.mongodb.kafka.connect.source.MongoSourceTask)
[2020-07-14 19:13:52,798] INFO Watching for collection changes on 'test.investigate1' (com.mongodb.kafka.connect.source.MongoSourceTask)
[2020-07-14 19:13:52,806] INFO Resuming the change stream after the previous offset (com.mongodb.kafka.connect.source.MongoSourceTask)
[2020-07-14 19:13:52,808] INFO Watching for collection changes on 'test.investigate1' (com.mongodb.kafka.connect.source.MongoSourceTask)
[2020-07-14 19:13:52,812] INFO Resuming the change stream after the previous offset using resumeAfter (com.mongodb.kafka.connect.source.MongoSourceTask)
[2020-07-14 19:13:53,058] INFO Failed to resume change stream: BSONObj size: 20889370 (0x13EBF1A) is invalid. Size must be between 0 and 16793600(16MB) First element: _id: { _data: BinData(0, 825F0CB26D00000001461E5F6964002B1E005A1004A99885442A0A41A3B400871ACF7D24B804) } 10334 (com.mongodb.kafka.connect.source.MongoSourceTask)
[2020-07-14 19:13:53,058] DEBUG Waiting 4392 ms to poll (com.mongodb.kafka.connect.source.MongoSourceTask)
[2020-07-14 19:13:55,555] INFO WorkerSourceTask{id=mongo-source-1-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask)
[2020-07-14 19:13:55,555] INFO WorkerSourceTask{id=mongo-source-1-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask)
[2020-07-14 19:13:55,560] INFO WorkerSourceTask{id=mongo-source-1-0} Finished commitOffsets successfully in 5 ms (org.apache.kafka.connect.runtime.WorkerSourceTask)
[2020-07-14 19:13:56,271] INFO WorkerSinkTask{id=s3-sink-1-0} Committing offsets asynchronously using sequence number 21: {mongo.test.investigate1-0=OffsetAndMetadata{offset=2, leaderEpoch=null, metadata=''}} (org.apache.kafka.connect.runtime.WorkerSinkTask)
[2020-07-14 19:13:57,451] DEBUG Creating a MongoCursor (com.mongodb.kafka.connect.source.MongoSourceTask)
[2020-07-14 19:13:57,571] INFO Watching for collection changes on 'test.investigate1' (com.mongodb.kafka.connect.source.MongoSourceTask)
[2020-07-14 19:13:57,575] INFO Resuming the change stream after the previous offset using resumeAfter (com.mongodb.kafka.connect.source.MongoSourceTask)
[2020-07-14 19:13:57,838] INFO Failed to resume change stream: BSONObj size: 20889370 (0x13EBF1A) is invalid. Size must be between 0 and 16793600(16MB) First element: _id: { _data: BinData(0, 825F0CB26D00000001461E5F6964002B1E005A1004A99885442A0A41A3B400871ACF7D24B804) } 10334 (com.mongodb.kafka.connect.source.MongoSourceTask)
[2020-07-14 19:13:57,838] DEBUG Poll await time passed before reaching max batch size returning 0 records (com.mongodb.kafka.connect.source.MongoSourceTask)
[2020-07-14 19:13:57,839] DEBUG Polling Start: 1594667637839 (com.mongodb.kafka.connect.source.MongoSourceTask)
[2020-07-14 19:13:57,839] DEBUG Creating a MongoCursor (com.mongodb.kafka.connect.source.MongoSourceTask)
[2020-07-14 19:13:58,076] INFO Watching for collection changes on 'test.investigate1' (com.mongodb.kafka.connect.source.MongoSourceTask)
[2020-07-14 19:13:58,081] INFO Resuming the change stream after the previous offset using resumeAfter (com.mongodb.kafka.connect.source.MongoSourceTask)
[2020-07-14 19:13:58,338] INFO Failed to resume change stream: BSONObj size: 20889370 (0x13EBF1A) is invalid. Size must be between 0 and 16793600(16MB) First element: _id: { _data: BinData(0, 825F0CB26D00000001461E5F6964002B1E005A1004A99885442A0A41A3B400871ACF7D24B804) } 10334 (com.mongodb.kafka.connect.source.MongoSourceTask)
[2020-07-14 19:13:58,338] DEBUG Waiting 4501 ms to poll (com.mongodb.kafka.connect.source.MongoSourceTask)
[2020-07-14 19:14:02,839] DEBUG Creating a MongoCursor (com.mongodb.kafka.connect.source.MongoSourceTask)
[2020-07-14 19:14:03,103] INFO Watching for collection changes on 'test.investigate1' (com.mongodb.kafka.connect.source.MongoSourceTask)
[2020-07-14 19:14:03,109] INFO Resuming the change stream after the previous offset using resumeAfter (com.mongodb.kafka.connect.source.MongoSourceTask)
[2020-07-14 19:14:03,361] INFO Failed to resume change stream: BSONObj size: 20889370 (0x13EBF1A) is invalid. Size must be between 0 and 16793600(16MB) First element: _id: { _data: BinData(0, 825F0CB26D00000001461E5F6964002B1E005A1004A99885442A0A41A3B400871ACF7D24B804) } 10334 (com.mongodb.kafka.connect.source.MongoSourceTask)
[2020-07-14 19:14:03,361] DEBUG Poll await time passed before reaching max batch size returning 0 records (com.mongodb.kafka.connect.source.MongoSourceTask)
[2020-07-14 19:14:03,362] DEBUG Polling Start: 1594667643362 (com.mongodb.kafka.connect.source.MongoSourceTask)
[2020-07-14 19:14:03,362] DEBUG Creating a MongoCursor (com.mongodb.kafka.connect.source.MongoSourceTask)
[2020-07-14 19:14:03,609] INFO Watching for collection changes on 'test.investigate1' (com.mongodb.kafka.connect.source.MongoSourceTask)
[2020-07-14 19:14:03,616] INFO Resuming the change stream after the previous offset using resumeAfter (com.mongodb.kafka.connect.source.MongoSourceTask)
[2020-07-14 19:14:03,876] INFO Failed to resume change stream: BSONObj size: 20889370 (0x13EBF1A) is invalid. Size must be between 0 and 16793600(16MB) First element: _id: { _data: BinData(0, 825F0CB26D00000001461E5F6964002B1E005A1004A99885442A0A41A3B400871ACF7D24B804) } 10334 (com.mongodb.kafka.connect.source.MongoSourceTask)
[2020-07-14 19:14:03,876] DEBUG Waiting 4486 ms to poll (com.mongodb.kafka.connect.source.MongoSourceTask)
[2020-07-14 19:14:05,561] INFO WorkerSourceTask{id=mongo-source-1-0} Committing offsets (org.apache.kafka.connect.runtime.WorkerSourceTask)
[2020-07-14 19:14:05,561] INFO WorkerSourceTask{id=mongo-source-1-0} flushing 0 outstanding messages for offset commit (org.apache.kafka.connect.runtime.WorkerSourceTask)
[2020-07-14 19:14:08,363] DEBUG Creating a MongoCursor (com.mongodb.kafka.connect.source.MongoSourceTask)
[2020-07-14 19:14:08,636] INFO Watching for collection changes on 'test.investigate1' (com.mongodb.kafka.connect.source.MongoSourceTask)
[2020-07-14 19:14:08,643] INFO Resuming the change stream after the previous offset using resumeAfter (com.mongodb.kafka.connect.source.MongoSourceTask)
[2020-07-14 19:14:08,878] INFO Failed to resume change stream: BSONObj size: 20889370 (0x13EBF1A) is invalid. Size must be between 0 and 16793600(16MB) First element: _id: { _data: BinData(0, 825F0CB26D00000001461E5F6964002B1E005A1004A99885442A0A41A3B400871ACF7D24B804) } 10334 (com.mongodb.kafka.connect.source.MongoSourceTask)
[2020-07-14 19:14:08,878] DEBUG Poll await time passed before reaching max batch size returning 0 records (com.mongodb.kafka.connect.source.MongoSourceTask)
[2020-07-14 19:14:08,878] DEBUG Polling Start: 1594667648878 (com.mongodb.kafka.connect.source.MongoSourceTask)
[2020-07-14 19:14:08,878] DEBUG Creating a MongoCursor (com.mongodb.kafka.connect.source.MongoSourceTask)
[2020-07-14 19:14:09,139] INFO Watching for collection changes on 'test.investigate1' (com.mongodb.kafka.connect.source.MongoSourceTask)
[2020-07-14 19:14:09,144] INFO Resuming the change stream after the previous offset using resumeAfter (com.mongodb.kafka.connect.source.MongoSourceTask)
[2020-07-14 19:14:09,377] INFO Failed to resume change stream: BSONObj size: 20889370 (0x13EBF1A) is invalid. Size must be between 0 and 16793600(16MB) First element: _id: { _data: BinData(0, 825F0CB26D00000001461E5F6964002B1E005A1004A99885442A0A41A3B400871ACF7D24B804) } 10334 (com.mongodb.kafka.connect.source.MongoSourceTask)
[2020-07-14 19:14:09,377] DEBUG Waiting 4501 ms to poll (com.mongodb.kafka.connect.source.MongoSourceTask)
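For context, the repeating "Resuming the change stream after the previous offset using resumeAfter" / "Failed to resume change stream ... 10334" pairs above are the connector re-opening a change stream cursor from its stored resume token, and the server refusing to build a change event larger than the 16 MB BSON limit. The sketch below is a minimal, standalone approximation of that pattern using the MongoDB Java sync driver (the same API the stack trace passes through: watch -> resumeAfter -> cursor.tryNext); it is not the connector's actual code. The connection string, the placeholder resume token, and the $project trimming stage are illustrative assumptions. A comparable trimming stage can be supplied to the source connector via its pipeline setting, though an event whose pre-projection form already exceeds the limit may still fail on the server.

import com.mongodb.client.MongoChangeStreamCursor;
import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
import com.mongodb.client.MongoCollection;
import com.mongodb.client.model.changestream.ChangeStreamDocument;
import org.bson.BsonDocument;
import org.bson.Document;

import java.util.Collections;
import java.util.List;

public class ChangeStreamResumeSketch {
    public static void main(String[] args) {
        // Host, database and collection mirror the log (mongo1:27017, test.investigate1).
        try (MongoClient client = MongoClients.create("mongodb://mongo1:27017")) {
            MongoCollection<Document> coll =
                    client.getDatabase("test").getCollection("investigate1");

            // Illustrative trimming stage (assumption, not connector configuration):
            // projecting away large fields such as fullDocument is one way to try to
            // keep each change event under the 16 MB limit reported as error 10334.
            List<BsonDocument> pipeline = Collections.singletonList(
                    BsonDocument.parse("{ \"$project\": { \"fullDocument\": 0 } }"));

            // Placeholder for the resume token the connector keeps in its offsets
            // (the _data value visible in the error messages above).
            BsonDocument resumeToken =
                    BsonDocument.parse("{ \"_data\": \"<saved resume token>\" }");

            try (MongoChangeStreamCursor<ChangeStreamDocument<Document>> cursor =
                         coll.watch(pipeline).resumeAfter(resumeToken).cursor()) {
                ChangeStreamDocument<Document> event;
                // tryNext() is the non-blocking call seen in the stack trace
                // (MongoChangeStreamCursorImpl.tryNext); it rethrows the 10334
                // MongoCommandException when the server cannot build the event.
                while ((event = cursor.tryNext()) != null) {
                    System.out.println(event.getResumeToken().toJson());
                }
            }
        }
    }
}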