Fix overflow bug in standard deviation computation. (#10100)
Summary: There was an overflow bug when computing the variance in the HistogramStat class. This manifests, for instance, when running cache_bench with default arguments, which executes 32M lookups/inserts/deletes in a block cache and then computes (among other things) the variance of the latencies. The variance is computed as ``variance = (cur_sum_squares * cur_num - cur_sum * cur_sum) / (cur_num * cur_num)``, where ``cur_sum_squares`` is the sum of the squares of the samples, ``cur_num`` is the number of samples, and ``cur_sum`` is the sum of the samples. Because the median latency in a typical run is around 3800 nanoseconds, both the ``cur_sum_squares * cur_num`` and ``cur_sum * cur_sum`` terms overflow as uint64_t.

Pull Request resolved: https://github.com/facebook/rocksdb/pull/10100

Test Plan: Added a unit test. Run ``make -j24 histogram_test && ./histogram_test``.

Reviewed By: pdillinger

Differential Revision: D36942738

Pulled By: guidotag

fbshipit-source-id: 0af5fb9e2a297a284e8e74c24e604d302906006e
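To illustrate the failure mode and one way around it, here is a minimal, hypothetical sketch (the function name, `main` driver, and sample magnitudes are illustrative, not the actual HistogramStat code): converting the accumulators to double before combining them keeps the intermediate products in range, at the cost of some floating-point rounding.

```cpp
#include <cstdint>
#include <iostream>

// Illustrative sketch, not the exact RocksDB patch. With ~32M samples and a
// median latency near 3800 ns, cur_sum is roughly 1.2e11, so cur_sum * cur_sum
// is about 1.5e22, far past the uint64_t maximum of ~1.8e19 (and
// cur_sum_squares * cur_num overflows similarly). Doing the arithmetic in
// double sidesteps both overflowing products.
double VarianceNoOverflow(uint64_t cur_sum_squares, uint64_t cur_sum,
                          uint64_t cur_num) {
  if (cur_num == 0) return 0.0;
  const double n = static_cast<double>(cur_num);
  const double mean = static_cast<double>(cur_sum) / n;
  // Population variance: E[X^2] - (E[X])^2, computed entirely in double.
  return static_cast<double>(cur_sum_squares) / n - mean * mean;
}

int main() {
  // Hypothetical magnitudes mimicking a default cache_bench run; every sample
  // is 3800 ns here, so the expected variance is 0, but the raw accumulators
  // are large enough that the original uint64_t products would overflow.
  const uint64_t num = 32u << 20;                         // ~33.5M samples
  const uint64_t sum = num * 3800ull;                     // sum of latencies (ns)
  const uint64_t sum_squares = num * 3800ull * 3800ull;   // sum of squared latencies
  std::cout << VarianceNoOverflow(sum_squares, sum, num) << "\n";
}
```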
parent 4f78f9699b
commit 2af132c341