Throughput mode gives me about 21 ops/ms, so I would expect one op to take roughly 1/21 ≈ 0.048 ms on average. Average mode, however, gives a score of 0.937 ms/op. How should I interpret these results? (A minimal sketch of the assumed benchmark setup follows the output below.)
Throughput mode:
Result "testMethod":
21.752 ±(99.9%) 0.081 ops/ms [Average]
(min, avg, max) = (20.681, 21.752, 22.529), stdev = 0.344
CI (99.9%): [21.671, 21.834] (assumes normal distribution)
# Run complete. Total time: 00:06:47
Benchmark Mode Cnt Score Error Units
MyBenchmark.testMethod thrpt 200 21.752 ± 0.081 ops/ms
Average mode:
Result "testMethod":
0.937 ±(99.9%) 0.004 ms/op [Average]
(min, avg, max) = (0.890, 0.937, 0.979), stdev = 0.019
CI (99.9%): [0.932, 0.941] (assumes normal distribution)
# Run complete. Total time: 00:06:47
Benchmark Mode Cnt Score Error Units
MyBenchmark.testMethod avgt 200 0.937 ± 0.004 ms/op
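For context, this is a minimal sketch of the kind of JMH setup assumed here; the real body of testMethod and its annotations (including any thread or fork settings) are not shown in the question, so treat the details below as placeholders:

import java.util.concurrent.TimeUnit;

import org.openjdk.jmh.annotations.Benchmark;
import org.openjdk.jmh.annotations.BenchmarkMode;
import org.openjdk.jmh.annotations.Mode;
import org.openjdk.jmh.annotations.OutputTimeUnit;

public class MyBenchmark {

    // Measured in both modes so the two scores above come from the same method.
    // ops/ms and ms/op follow from the millisecond output time unit.
    @Benchmark
    @BenchmarkMode({Mode.Throughput, Mode.AverageTime})
    @OutputTimeUnit(TimeUnit.MILLISECONDS)
    public void testMethod() {
        // hypothetical workload under test (the actual code is not shown)
    }
}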