- Type: Task
- Resolution: Unresolved
- Priority: Major - P3
- Affects Version/s: None
- Component/s: None
- StorEng - Refinement Pipeline
The WiredTiger automated performance tests aim to provide a reliable signal when a change introduces a performance regression.
They are currently run on "regular" Evergreen hosts, which do not provide predictable I/O performance, so any performance test that depends on I/O will not produce repeatable results.
andrew.morton@mongodb.com encountered this recently while reviewing the performance impact of changing how assertions are implemented in WiredTiger.
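To make "repeatable" concrete: one way to vet a host before trusting its numbers is to run a small I/O probe several times and measure the run-to-run variance. The sketch below is a hypothetical illustration only; the file size, run count, and the ~5% coefficient-of-variation threshold are assumptions, not values taken from the WiredTiger test suite or Evergreen.

```python
#!/usr/bin/env python3
# Hypothetical probe: estimate run-to-run I/O variance on a candidate host.
# Sizes, run count, and the 5% CV threshold are illustrative assumptions.
import os
import statistics
import time

RUNS = 10
WRITE_BYTES = 256 * 1024 * 1024  # 256 MiB written per run
CHUNK = 4 * 1024 * 1024          # 4 MiB per write call

def timed_write(path: str) -> float:
    """Write WRITE_BYTES with a final fsync; return elapsed seconds."""
    buf = os.urandom(CHUNK)
    start = time.monotonic()
    with open(path, "wb") as f:
        for _ in range(WRITE_BYTES // CHUNK):
            f.write(buf)
        f.flush()
        os.fsync(f.fileno())
    elapsed = time.monotonic() - start
    os.remove(path)
    return elapsed

samples = [timed_write("io_probe.tmp") for _ in range(RUNS)]
mean = statistics.mean(samples)
cv = statistics.stdev(samples) / mean  # coefficient of variation
print(f"mean={mean:.2f}s cv={cv:.1%}")
# A CV much above ~5% suggests the host's I/O is too noisy to give
# repeatable performance results (the threshold is an assumption).
```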
We should:
- Review the hardware used to run the performance tests, and ensure it is capable of producing repeatable results.
- Review the workloads run in our automated performance testing, and ensure each can give predictable results (i.e., each workload generally stays within the capacity of the hardware it runs on).
- Clearly identify tests that exceed the hardware's capability and articulate the value they provide. Attempt to define metrics that can be measured predictably from those tests, or at least document a description of their value somewhere (a noise check along the lines of the sketch after this list could help flag such tests).
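For the last point, a simple noise check over historical results could flag tests whose run-to-run variance swamps any regression signal. The sketch below is hypothetical: the `results` data, test names, and the 10% threshold are illustrative assumptions, not the actual WiredTiger perf-result schema.

```python
# Hypothetical check: flag perf tests whose historical results are too noisy
# to detect regressions. Data and threshold are illustrative assumptions.
import statistics

NOISE_THRESHOLD = 0.10  # max acceptable coefficient of variation (assumed)

# test name -> throughput samples (ops/sec) from recent unchanged builds
results = {
    "evict-btree": [61200, 60950, 61480, 60800],
    "checkpoint-stress": [9100, 12400, 7800, 14100],  # I/O-bound, noisy
}

for name, samples in results.items():
    cv = statistics.stdev(samples) / statistics.mean(samples)
    verdict = "too noisy for regression signal" if cv > NOISE_THRESHOLD else "ok"
    print(f"{name}: cv={cv:.1%} ({verdict})")
```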
- related to: WT-10714 Select an explicitly labeled perf distro for performance tests (Closed)