I have an insanely big directory and I need to get the file list from Python.
In my code I need an iterator, not a list, so these don't work:
os.listdir
glob.glob (uses listdir!)
os.walk
I can't find any good library for this. Help! Maybe a C++ lib?
If you have a directory that is too big for libc readdir() to read quickly, you probably want to look at the kernel call getdents() (http://www.kernel.org/doc/man-pages/online/pages/man2/getdents.2.html). I ran into a similar problem and wrote a long blog post about it:
http://www.olark.com/spw/2011/08/you-can-list-a-directory-with-8-million-files-but-not-with-ls/
Basically, readdir() only reads 32K of directory entries at a time, so if you have a lot of files in a directory, readdir() will take a very long time to complete.
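To make the idea concrete, here is a minimal sketch of calling getdents64 yourself with ctypes. The syscall number (217) and the little-endian linux_dirent64 layout assumed below are specific to x86-64 Linux, and this is Python 3; for real work, os.scandir() or a wrapper package is the safer route.

import ctypes
import os

SYS_getdents64 = 217            # x86-64 Linux only; differs on other architectures
BUF_SIZE = 5 * 1024 * 1024      # a much bigger buffer than readdir's 32K => far fewer syscalls

libc = ctypes.CDLL(None, use_errno=True)

def getdents64(path, buf_size=BUF_SIZE):
    """Yield entry names from `path`, one syscall buffer at a time."""
    fd = os.open(path, os.O_RDONLY | os.O_DIRECTORY)
    buf = ctypes.create_string_buffer(buf_size)
    try:
        while True:
            nread = libc.syscall(SYS_getdents64, fd, buf, buf_size)
            if nread < 0:
                err = ctypes.get_errno()
                raise OSError(err, os.strerror(err))
            if nread == 0:
                break                      # end of directory
            data = buf.raw[:nread]
            offset = 0
            while offset < nread:
                # struct linux_dirent64: u64 d_ino, s64 d_off, u16 d_reclen,
                # u8 d_type, then a NUL-terminated d_name
                d_reclen = int.from_bytes(data[offset + 16:offset + 18], "little")
                name = data[offset + 19:offset + d_reclen].split(b"\0", 1)[0]
                if name not in (b".", b".."):
                    yield name.decode("utf-8", "surrogateescape")
                offset += d_reclen
    finally:
        os.close(fd)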
For Python 2.x:
import scandir
scandir.walk()

For Python 3.5+:
os.scandir()
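A minimal sketch, assuming the scandir backport is installed on Python 2.x (pip install scandir); on Python 3.5+ you can swap scandir.scandir for os.scandir unchanged, since the API is the same:

import scandir  # Python 2.x backport of os.scandir: pip install scandir

def iter_names(path):
    # Yield entry names one at a time instead of building one giant list.
    for entry in scandir.scandir(path):
        if entry.is_file():
            yield entry.name

# Example usage; the path here is just a placeholder.
for name in iter_names('/data/huge_dir'):
    print(name)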
I found this library useful: https://github.com/benhoyt/scandir.
I think using opendir() would work, and there is a Python package, http://pypi.python.org/pypi/opendir/0.0.1, that wraps it via Pyrex.
You should use a generator. This problem is discussed here: http://bugs.python.org/issue11406
Someone built a Python module off that article that wraps getdents. By the way, I know this post is old, but you could use scandir (and I have done that with directories containing 21 million files). walk() is way too slow; it is also a generator, but it carries too much overhead.
The getdents module seems like an interesting alternative. I have not used it, but it is based on the "8 million files with ls" article referenced above. Reading through the code, it looks like it would be fun and fast to use, and it also lets you tweak the buffer size without having to drop into C directly.
https://github.com/ZipFile/python-getdents It is available via pip and PyPI, though I recommend reading the docs.
I found this library to be really fast:
https://pypi.org/project/scandir/
I used the code below from this library, and it worked like a charm:
import os

def subdirs(path):
    """Yield directory names not starting with '.' under given path."""
    for entry in os.scandir(path):
        if not entry.name.startswith('.') and entry.is_dir():
            yield entry.name
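For example, to consume it lazily (the path here is just a placeholder):

for name in subdirs('/data/huge_dir'):
    print(name)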
http://docs.python.org/release/2.6.5/library/os.html#os.walk
>>> import os
>>> type(os.walk('/'))
<type 'generator'>
os.walk uses listdir internally.
How about glob.iglob? It's the iterator glob.
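glob.iglob does hand back an iterator, but note that on Python 2 (and on Python 3 before 3.5) each directory it matches against is still read in full with os.listdir() under the hood, so it does not by itself fix the huge-directory case. A minimal sketch, with a made-up pattern:

import glob

# Lazily iterate over matches; the pattern is just an example.
for path in glob.iglob('/var/log/*.log'):
    print(path)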