Throttle a function call in Python
I have the following type of code, but it is slow because report() is called very often.

import time
import random

def report(values):
    open('report.html', 'w').write(str(values))

values = []

for i in range(10000):
    # some computation
    r = random.random() / 100.
    values.append(r)
    time.sleep(r)
    # report on the current status, but this should not slow things down
    report(values)

In this illustrative code example, I would like the report to be up-to-date (at most 10s old), so I would like to throttle that function.

I could fork inside report, write the current timestamp, wait for that period, and then check a shared-memory timestamp to see whether report has been called in the meantime. If yes, terminate; if not, write the report.

Is there a more elegant way to do it in Python?

Panto answered 10/8, 2015 at 14:58 Comment(6)
Use threading with a shared queue? – Apposition
I imagine it's slow because you're opening the file every time (which should also be closed). If you keep the file open (pass it into the report function or create a reporter class), it might not take as long. – Marden
You're intending to overwrite the file each time? – Viveca
What @Trengot said. You should open the file outside of the for loop, and then pass the opened file to the report() function as a second parameter. Then close the file after the for loop has ended. Though your whole report() function becomes def report(f, values): f.write(str(values)), and you might consider inlining it. No need to re-create the file.write() method :) – Transilluminate
Since you're only ever adding values to the report file and never changing them, could you do something with appending values to the file by opening it in append mode? E.g. you don't need to write all the values every time; you could append to the file every N values. – Delmerdelmor
Also, as @Apposition says, if you want to be writing the previous 'batch' while calculating the next batch, then you should use threading (or similar) with a shared queue (or similar). – Delmerdelmor
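A minimal sketch of the threading idea from these comments: a daemon thread wakes at the reporting interval, writes a snapshot of the values, and performs one final write after the work loop finishes. The `run` wrapper, the `done` event, and the file-path parameter are illustrative names, not from the question.

```python
import threading
import time
import random

def report(values, path='report.html'):
    # overwrite the report file with the current values
    with open(path, 'w') as f:
        f.write(str(values))

def run(n=10000, interval=10.0):
    values = []
    done = threading.Event()

    def reporter():
        # Wake at most every `interval` seconds; wait() returns True once
        # `done` is set, which ends the loop. Copying the list gives a
        # consistent snapshot (safe under CPython's GIL while the main
        # thread appends).
        while not done.wait(interval):
            report(list(values))
        report(list(values))  # final write: the last values are never lost

    t = threading.Thread(target=reporter, daemon=True)
    t.start()
    for _ in range(n):
        # some computation
        r = random.random() / 100.
        values.append(r)
        time.sleep(r)
    done.set()
    t.join()
    return values
```

With the question's numbers this runs for around a minute; something like `run(n=100, interval=1.0)` shows the behaviour quickly.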

Here's a decorator that takes an argument for how long to protect the inner function, raising an exception if it is called too soon.

import time
from functools import partial, wraps

class TooSoon(Exception):
    """Can't be called so soon"""
    pass

class CoolDownDecorator(object):
    def __init__(self, func, interval):
        self.func = func
        self.interval = interval
        self.last_run = 0

    def __get__(self, obj, objtype=None):
        # Descriptor support so the decorator also works on methods:
        # bind the instance as the first argument.
        if obj is None:
            return self.func
        return partial(self, obj)

    def __call__(self, *args, **kwargs):
        now = time.time()
        if now - self.last_run < self.interval:
            raise TooSoon("Call after {0} seconds".format(self.last_run + self.interval - now))
        else:
            self.last_run = now
            return self.func(*args, **kwargs)

def CoolDown(interval):
    def applyDecorator(func):
        decorator = CoolDownDecorator(func=func, interval=interval)
        return wraps(func)(decorator)
    return applyDecorator

Then:

>>> @CoolDown(10)
... def demo():
...     print("demo called")
...
>>>
>>> for i in range(12):
...     try:
...         demo()
...     except TooSoon as exc:
...         print(exc)
...     time.sleep(1)
...
demo called
Call after 8.99891519547 seconds
Call after 7.99776816368 seconds
Call after 6.99661898613 seconds
Call after 5.99548196793 seconds
Call after 4.9943420887 seconds
Call after 3.99319410324 seconds
Call after 2.99203896523 seconds
Call after 1.99091005325 seconds
Call after 0.990563154221 seconds
demo called
Call after 8.99888515472 seconds
Viveca answered 10/8, 2015 at 15:43 Comment(5)
The problem with this approach in the OP's context is that the final N seconds of data will never be written to the file. – Delmerdelmor
Good point. However, you could keep an un-decorated version of the function and call that as the final write. – Viveca
This solution has the problem that if the duration of a computation (time.sleep here) is random (e.g. between 0.1s and 100s), demo() may not be called for a long time, much longer than the 10s. – Panto
@j13r: I've understood the question to ask about throttling. I consider preventing multiple executions within a specified time to be a reasonable attempt at throttling. What you describe as a problem with this solution, I consider to be the feature that provides the solution. If the work takes 100s then there's nothing new to go into the report. Once that iteration of work completes, the report will be written. If the next work takes .01s, then the report won't be updated. – Viveca
The difficulty is rather with the reverse scenario: if you have .01s, .01s and then 100s. – Panto
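Combining this answer's throttling idea with the final-write concern raised in the comments, one simple variant for the OP's loop is an inline timestamp check plus one unconditional write at the end. This is a sketch; the `run` wrapper and the file-path parameter are illustrative names, not from the question or the answer.

```python
import time
import random

def report(values, path='report.html'):
    # overwrite the report file with the current values
    with open(path, 'w') as f:
        f.write(str(values))

def run(n=10000, interval=10.0):
    values = []
    last_report = 0.0  # timestamp of the most recent report
    for _ in range(n):
        # some computation
        r = random.random() / 100.
        values.append(r)
        time.sleep(r)
        now = time.time()
        # throttle: at most one write per `interval` seconds
        if now - last_report >= interval:
            report(values)
            last_report = now
    report(values)  # unconditional final write, so the last batch is kept
    return values
```

Unlike the raising decorator above, skipped calls here are silently dropped, and the final state always reaches the file.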

Here is an example of throttling a function call using a closure in Python 3.

import time

def get_current_time_milli():
    return int(round(time.time() * 1000))

def mycallbackfunction():
    time.sleep(0.1)  # mocking some work
    print("in callback function...")


'''
Throttle a function call using a closure.
Don't call the callback until the last invocation was more than 100 ms ago.
Only works with Python 3.
Caveat: in Python 2 we cannot rebind a nonlocal variable inside the closure.
'''

def debouncer(callback, throttle_time_limit=100):

    last_millis = get_current_time_milli()

    def throttle():
        nonlocal last_millis
        curr_millis = get_current_time_milli()
        if (curr_millis - last_millis) > throttle_time_limit:
            last_millis = get_current_time_milli()
            callback()

    return throttle

myclosure_function = debouncer(mycallbackfunction, 100)

# Some event triggers this call repeatedly: we call myclosure_function
# 20 times, but the callback executes only a few times.
for i in range(20):
    myclosure_function()
    print('calling my closure', get_current_time_milli())
Trollope answered 25/1, 2019 at 21:44 Comment(1)
Why are you using milliseconds? – Fenrir
