I'm using Python 3's built-in functools.lru_cache
decorator to memoize some expensive functions. I would like to memoize as many calls as possible without using too much memory, since caching too many values causes thrashing.
Is there a preferred technique or library for accomplishing this in Python?
For example, this question led me to a Go library for system-memory-aware LRU caching. Something similar for Python would be ideal.
Note: I can't just estimate the memory used per value and set maxsize
accordingly, since several processes will be calling the decorated function in parallel; a solution would need to dynamically check how much memory is actually free.
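One way to sketch this: a memoizing decorator that only admits new cache entries while a free-memory probe reports enough headroom. The probe is passed in as a callable; with the third-party psutil package installed it could be lambda: psutil.virtual_memory().available (that dependency is an assumption, any free-memory reading works). This is a minimal sketch, not a drop-in lru_cache replacement.

```python
import functools

def memory_aware_cache(min_free_bytes, get_free_bytes):
    """Memoize, but only store *new* results while get_free_bytes()
    reports at least min_free_bytes of free memory.

    get_free_bytes: zero-argument callable returning free bytes.
    With psutil installed (an assumption, not required here) it
    could be: lambda: psutil.virtual_memory().available
    """
    def decorator(func):
        cache = {}

        @functools.wraps(func)
        def wrapper(*args):
            if args in cache:
                return cache[args]
            result = func(*args)
            # Grow the cache only while enough memory is free;
            # already-cached entries are still served regardless.
            if get_free_bytes() >= min_free_bytes:
                cache[args] = result
            return result

        wrapper.cache = cache  # exposed for inspection
        return wrapper
    return decorator
```

Note this sketch only stops admitting new entries under memory pressure; a fuller solution, closer to the Go library above, would also evict existing entries in LRU order when free memory drops.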
Have you considered modifying the lru_cache
implementation? The easiest way would be to simply check the memory usage within the decorator. It would add some overhead for sure, but in this application I don't think it would be significant. – Comate

import functools
and just enter the module name, functools
. Works for locating the source of nearly any Python module (except C extensions, of course). – Fraktur

functools.lru_cache
was written by Raymond Hettinger. He posted several different (LRU) caching / memoization recipes; maybe you can find something useful, or at least inspirational, in those ;-) – Fraktur
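On locating the source as Fraktur's comment suggests: entering the module name in an interactive session (IPython, for instance) displays the path it was loaded from; the same information is available programmatically, e.g.:

```python
import functools
import inspect

# Path of the file that defines the module (pure-Python modules only).
print(functools.__file__)

# inspect can also point at the file defining a specific object.
print(inspect.getsourcefile(functools.lru_cache))
```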