I wrote a scanner that looks for certain files on all hard drives of the system being scanned. Some of these systems are pretty old, running Windows 2000 with only 256 or 512 MB of RAM, but their file system structure is complex, as some of them serve as file servers.
I use os.walk() in my script to traverse all directories and files.
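For context, the scanning loop looks roughly like this (a minimal sketch; TARGET_NAMES and the drive root are placeholders, not the real script's values):

```python
import os

# Placeholder set of filenames the scanner looks for.
TARGET_NAMES = {"secret.dat", "config.bak"}

def scan(root):
    # Walk the whole tree under `root` and yield the full path of
    # every file whose name matches one of the target names.
    for dirpath, dirnames, filenames in os.walk(root):
        for name in filenames:
            if name in TARGET_NAMES:
                yield os.path.join(dirpath, name)

for hit in scan("C:\\"):
    print(hit)
```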
Unfortunately, we noticed that the scanner consumes a lot of RAM after scanning for a while: the os.walk function alone uses about 50 MB of RAM after 2 hours of walking the file system, and this usage keeps growing over time, reaching about 90 MB after 4 hours of scanning.
Is there a way to avoid this behaviour? We also tried betterwalk.walk() and scandir.walk(), but the result was the same. Do we have to write our own walk function that drops references to already-scanned directory and file objects so that the garbage collector can reclaim them from time to time?
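For illustration, here is roughly the kind of hand-rolled walk we have in mind: a sketch only, which keeps nothing between iterations except a stack of pending directory paths, so listing results can be collected as soon as the caller is done with them:

```python
import os

def plain_walk(root):
    # Iterative replacement for os.walk that holds only a stack of
    # pending directory paths; each directory listing is yielded and
    # then dropped, leaving nothing for the walk itself to retain.
    stack = [root]
    while stack:
        current = stack.pop()
        try:
            names = os.listdir(current)
        except OSError:
            continue  # unreadable directory, skip it
        dirs, files = [], []
        for name in names:
            # os.path.isdir is the same check os.walk performs
            # internally, so any leak in that call would carry over.
            if os.path.isdir(os.path.join(current, name)):
                dirs.append(name)
            else:
                files.append(name)
        yield current, dirs, files
        stack.extend(os.path.join(current, d) for d in dirs)
```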
Thanks
The culprit is os.path.isdir, which is used in the os.walk implementation; you can read about it at this post. As far as I know it was fixed in Python 3; see the leak report here. – Lodged
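If that is the cause, a quick way to confirm it is to call os.path.isdir in a tight loop on the affected interpreter and watch the process's memory (a diagnostic sketch; the probe path is arbitrary):

```python
import os

PROBE = "C:\\"  # any directory that exists on the machine under test

# If the process's memory grows steadily while this loop runs, the
# leak is in os.path.isdir itself rather than in os.walk's bookkeeping.
for _ in range(1000000):
    os.path.isdir(PROBE)
```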