I have a little script that moves files around in my photo collection, but it runs a bit slowly.
I think that's because it moves one file at a time. My guess is that I could speed it up by doing all of the moves from one dir to another at the same time. Is there a way to do that?
If that's not the cause of the slowness, how else can I speed this up?
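To be concrete about what I mean by "at the same time": something like this Python 3 sketch using concurrent.futures (the function name and worker count are mine, not from my script — and I realize that on the same filesystem a move is just a rename, so this may not actually help):

```python
import os
import shutil
import tempfile
from concurrent.futures import ThreadPoolExecutor

def move_concurrently(src_paths, dst_dir, max_workers=8):
    """Move each file in src_paths into dst_dir using a thread pool.

    shutil.move is I/O-bound, so several moves can overlap in threads
    even under the GIL. The pool helps most for cross-device moves,
    where each move is a copy followed by a delete.
    """
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        # list() drains the iterator so all moves finish before we
        # return, and re-raises any exception from a worker.
        return list(pool.map(lambda p: shutil.move(p, dst_dir), src_paths))
```

Is something along those lines the right approach, or is the per-file overhead not where the time goes?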
Update:
I don't think my problem is being understood. Perhaps listing my source code will help explain:
# ORF is the file extension of the files I want to move;
# These files live in dirs shared by JPEG files,
# which I do not want to move.
import os
import re
from glob import glob
import shutil

DIGITAL_NEGATIVES_DIR = ...
DATE_PATTERN = re.compile(r'\d{4}-\d\d-\d\d')

# Move a single ORF.
def move_orf(src):
    dir, fn = os.path.split(src)
    shutil.move(src, os.path.join('raw', dir))

# Move all ORFs in a single directory.
def move_orfs_from_dir(src):
    orfs = glob(os.path.join(src, '*.ORF'))
    if not orfs:
        return
    os.mkdir(os.path.join('raw', src))
    print 'Moving %3d ORF files from %s to raw dir.' % (len(orfs), src)
    for orf in orfs:
        move_orf(orf)

# Scan for dirs that contain ORFs that need to be moved, and move them.
def main():
    os.chdir(DIGITAL_NEGATIVES_DIR)
    src_dirs = filter(DATE_PATTERN.match, os.listdir(os.curdir))
    for dir in src_dirs:
        move_orfs_from_dir(dir)

if __name__ == '__main__':
    main()