You are hitting a limitation of the Windows file system itself. When the number of files in a directory grows very large (and 14M is way beyond that threshold), accessing the directory becomes incredibly slow. It doesn't really matter whether you read one file at a time or 1000; the bottleneck is the directory access itself.
One way to solve this is to create subdirectories and split your files into groups. If each directory holds 1,000-5,000 files (a guess; experiment with the actual numbers), you should get decent performance opening, creating, and deleting files.
This is why applications like Doxygen, which creates a file for every class, follow this scheme and put everything into two levels of subdirectories with randomized names.
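A minimal sketch of that idea in C# (the hash function, bucket depth, and names here are illustrative assumptions, not what Doxygen itself uses):

    using System.IO;
    using System.Security.Cryptography;
    using System.Text;

    static class BucketedStore
    {
        // Map a logical file name to a path two subdirectory levels deep,
        // derived from a hash of the name, so no single directory grows
        // beyond a few thousand entries.
        public static string GetBucketedPath(string rootDir, string fileName)
        {
            byte[] hash;
            using (var md5 = MD5.Create())
                hash = md5.ComputeHash(Encoding.UTF8.GetBytes(fileName));

            // Two hex bytes give 256 buckets per level => 65,536 leaf directories.
            string level1 = hash[0].ToString("x2");
            string level2 = hash[1].ToString("x2");

            string dir = Path.Combine(rootDir, level1, level2);
            Directory.CreateDirectory(dir); // no-op if it already exists
            return Path.Combine(dir, fileName);
        }
    }

With 14M files spread over 65,536 leaf directories you end up with only a couple of hundred files per directory on average, which keeps directory lookups fast.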
DirectoryInfo.GetFiles() is also horrible if you are using a network SAN. It locks all files and blocks others from accessing recently created SAN files. We never did find a non-blocking resolution. – Owl
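For what it's worth, Directory.EnumerateFiles yields paths lazily instead of materializing the whole listing into an array the way GetFiles does. A short sketch (the UNC path is a placeholder, and whether this avoids the SAN locking Owl describes is not something this thread confirms):

    using System.IO;

    // GetFiles() builds the entire array before returning; EnumerateFiles()
    // streams results, so you can start processing (or stop early) without
    // waiting for the full directory scan to finish.
    foreach (string path in Directory.EnumerateFiles(@"\\san\share\data", "*.dat"))
    {
        // process each file as it is enumerated
    }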