I'm looping 1000 times with a 1 ms delay per iteration and measuring the total time. It's very interesting that the total comes to 15.6 seconds instead of 1. When I open Google Chrome and browse some websites, it runs correctly at about 1 second total. It also runs fine on a MacBook. I'm wondering what I need to do to fix this problem? Please try running it once without Chrome open and again with Chrome open to see the difference. On my system it runs normally when Quora, Reddit, or Stack Overflow is open.
from timeit import default_timer as timer
import time

start = timer()
for i in range(1000):
    time.sleep(0.001)
end = timer()
print("Total time:", end - start)
Edit: I didn't change anything in the Python code. I just opened Chrome and browsed some websites, and the delays sped up.
Update: it's the timer resolution on Windows. Chrome changes the system timer resolution from 15.6 ms to 1 ms. This article explains it very well: https://randomascii.wordpress.com/2013/07/08/windows-timer-resolution-megawatts-wasted/
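A sketch of a workaround, based on the article above: on Windows you can request a 1 ms timer interrupt period yourself via the `timeBeginPeriod`/`timeEndPeriod` calls in `winmm.dll` (the same mechanism Chrome uses), accessed here through `ctypes`. The fallback branch for non-Windows platforms is an assumption for portability; the sleep loop is the one from the question.

```python
import sys
import time
from timeit import default_timer as timer

def timed_sleep_loop(n=1000, delay=0.001):
    """Run n sleeps of `delay` seconds and return the total elapsed time."""
    start = timer()
    for _ in range(n):
        time.sleep(delay)
    return timer() - start

if sys.platform == "win32":
    import ctypes
    winmm = ctypes.WinDLL("winmm")
    winmm.timeBeginPeriod(1)    # request 1 ms timer resolution (what Chrome does)
    try:
        print("Total time:", timed_sleep_loop())
    finally:
        winmm.timeEndPeriod(1)  # always restore the default resolution
else:
    # macOS/Linux sleeps are already fine-grained; no call needed.
    print("Total time:", timed_sleep_loop())
```

Note that the raised resolution is system-wide and costs power, which is the point of the linked article, so it should be ended as soon as it is no longer needed.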
There is no guarantee that sleep(0.001) will sleep for exactly a millisecond, rather than just at least 1ms. You might want to read stackoverflow.com/questions/9518106 – Fridlund
The sleep implementation is indeed interrupt based. – Weeden