I am using the LatencyUtils package for tracking and reporting on the behavior of latencies across measurements.
The latencies recorded with recordLatency() are expected in nanoseconds, but the values I have are measured in milliseconds. Is there a better way to record times that are in milliseconds?
My current workaround is to multiply every recorded value by one million before passing it in. Since I still want the results in milliseconds, I divide everything I read back out by one million.
public void addValue(Long val, long sampleCount) {
    sum += val * sampleCount;
    for (int i = 0; i < sampleCount; i++) {
        // val is in milliseconds; recordLatency() expects nanoseconds, hence the factor of one million
        latencyStats.recordLatency(val * 1000000);
    }
    histogram.add(latencyStats.getIntervalHistogram());
    max = Math.max(val, max);
    min = Math.min(val, min);
    updateValueCount(val, sampleCount);
}
@Override
public double getStandardDeviation() {
    // histogram values are in nanoseconds; scale back down to milliseconds
    return histogram.getStdDeviation() / 1000000;
}
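
For context, this is roughly how I could express the same conversion with java.util.concurrent.TimeUnit instead of the hard-coded factor of one million. It assumes the same latencyStats and histogram fields as in the code above; the method names here are just for illustration, not part of the library:

import java.util.concurrent.TimeUnit;

// Same workaround as addValue()/getStandardDeviation() above, but with the
// unit conversion spelled out via TimeUnit instead of a literal 1000000.
public void addValueInMillis(long valMillis, long sampleCount) {
    long valNanos = TimeUnit.MILLISECONDS.toNanos(valMillis); // ms -> ns for recordLatency()
    for (int i = 0; i < sampleCount; i++) {
        latencyStats.recordLatency(valNanos);
    }
    histogram.add(latencyStats.getIntervalHistogram());
}

public double getStandardDeviationInMillis() {
    // histogram values are in nanoseconds; scale back down to milliseconds
    return histogram.getStdDeviation() / (double) TimeUnit.MILLISECONDS.toNanos(1);
}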
And the default constructor of LatencyStats looks like this:
private long lowestTrackableLatency = 1000L; /* 1 usec */
private long highestTrackableLatency = 3600000000000L; /* 1 hr */
private int numberOfSignificantValueDigits = 2;
private int intervalEstimatorWindowLength = 1024;
private long intervalEstimatorTimeCap = 10000000000L; /* 10 sec */
private PauseDetector pauseDetector = null;

public LatencyStats() {
    this(
            defaultBuilder.lowestTrackableLatency,
            defaultBuilder.highestTrackableLatency,
            defaultBuilder.numberOfSignificantValueDigits,
            defaultBuilder.intervalEstimatorWindowLength,
            defaultBuilder.intervalEstimatorTimeCap,
            defaultBuilder.pauseDetector
    );
}
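
If I read those defaults correctly, the no-argument constructor should be equivalent to something like the explicit call below, with every latency-related parameter in nanoseconds. This is only my interpretation of the snippet above, assuming the six-argument constructor it delegates to is public:

import org.LatencyUtils.LatencyStats;

// My reading of the defaults: every latency parameter is in nanoseconds.
LatencyStats stats = new LatencyStats(
        1000L,            // lowestTrackableLatency: 1 usec
        3600000000000L,   // highestTrackableLatency: 1 hr
        2,                // numberOfSignificantValueDigits
        1024,             // intervalEstimatorWindowLength
        10000000000L,     // intervalEstimatorTimeCap: 10 sec
        null              // pauseDetector, null as in defaultBuilder
);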
So the lowest trackable latency of LatencyStats is, in fact, also expressed in nanoseconds. If I put in values measured in milliseconds, I am afraid that will affect the recorded results.
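
To make that concern concrete, here is the arithmetic I am worried about. It only compares magnitudes against the default lowest trackable latency; it makes no claim about how the histogram actually buckets such values:

long lowestTrackableLatency = 1000L;        // default shown above, in nanoseconds (1 usec)

long fiveMsConverted = 5L * 1000000L;       // 5 ms converted to ns = 5,000,000 ns
long fiveMsUnconverted = 5L;                // 5 ms passed in as-is would be read as 5 ns

System.out.println(fiveMsConverted >= lowestTrackableLatency);   // true: well inside the range
System.out.println(fiveMsUnconverted >= lowestTrackableLatency); // false: below the lowest trackable latency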