- Type: Bug
- Resolution: Won't Fix
- Priority: Minor - P4
- Affects Version/s: 3.0.3, 3.2
- Component/s: None
- Environment: python 2.7.10, pymongo 3.2, db version v3.2.12

```
[2018-03-21 11:28:14,666: WARNING/ForkPoolWorker-18] /opt/python2.7.10-customized/lib/python2.7/site-packages/celery/app/trace.py:542: RuntimeWarning: Exception raised outside body: DuplicateKeyError(u'E11000 duplicate key error collection: celery.celery_taskmeta index: _id_ dup key: { : "d832c36e-8f89-4a17-ba20-3922a89ce39a" }',):
Traceback (most recent call last):
  File "/opt/python2.7.10-customized/lib/python2.7/site-packages/celery/app/trace.py", line 362, in trace_task
    request=task_request,
  File "/opt/python2.7.10-customized/lib/python2.7/site-packages/celery/backends/base.py", line 309, in store_result
    request=request, **kwargs)
  File "/opt/python2.7.10-customized/lib/python2.7/site-packages/celery/backends/mongodb.py", line 175, in _store_result
    self.collection.save(meta)
  File "/opt/python2.7.10-customized/lib/python2.7/site-packages/pymongo/collection.py", line 1906, in save
    check_keys, False, manipulate, write_concern)
  File "/opt/python2.7.10-customized/lib/python2.7/site-packages/pymongo/collection.py", line 535, in _update
    _check_write_command_response(results)
  File "/opt/python2.7.10-customized/lib/python2.7/site-packages/pymongo/helpers.py", line 260, in _check_write_command_response
    raise DuplicateKeyError(error.get("errmsg"), 11000, error)
DuplicateKeyError: E11000 duplicate key error collection: celery.celery_taskmeta index: _id_ dup key: { : "d832c36e-8f89-4a17-ba20-3922a89ce39a" }
```
I use MongoDB as the result backend for Celery, and this error happens only rarely.
I checked the Celery source code: in `self.collection.save(meta)`, `meta` always has an `_id` field, so `save` ends up calling `_update` (an upsert) rather than an insert.
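For reference, here is a minimal sketch (not Celery's own backend code) of the write pattern involved and the client-side retry workaround commonly suggested for this kind of upsert race; the `store_result` helper, connection URL, and retry count below are illustrative assumptions.

```python
from pymongo import MongoClient
from pymongo.errors import DuplicateKeyError

# Hypothetical connection details; the real backend derives these from the
# Celery result-backend URL.
client = MongoClient("mongodb://localhost:27017")
collection = client["celery"]["celery_taskmeta"]

def store_result(meta, retries=1):
    """Upsert task metadata keyed by _id, retrying once on a duplicate key.

    replace_one(..., upsert=True) is roughly what the deprecated
    Collection.save() does when the document already has an _id. If two
    workers upsert the same _id at nearly the same time, both can miss on
    the query, both attempt the insert, and the loser gets E11000. A
    retry lets the losing attempt match the now-existing document and
    update it instead.
    """
    for attempt in range(retries + 1):
        try:
            collection.replace_one({"_id": meta["_id"]}, meta, upsert=True)
            return
        except DuplicateKeyError:
            if attempt == retries:
                raise  # give up after the final retry
```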
- is related to: SERVER-14322 Retry on predicate unique index violations of update + upsert -> insert when possible (Closed)