Ignore folders/files when Directory.GetFiles() is denied access

I am trying to display a list of all files found in the selected directory (and optionally any subdirectories). The problem I am having is that when the GetFiles() method comes across a folder that it cannot access, it throws an exception and the process stops.

How do I ignore this exception (and ignore the protected folder/file) and continue adding accessible files to the list?

try
{
    if (cbSubFolders.Checked == false)
    {
        string[] files = Directory.GetFiles(folderBrowserDialog1.SelectedPath);
        foreach (string fileName in files)
            ProcessFile(fileName);
    }
    else
    {
        string[] files = Directory.GetFiles(folderBrowserDialog1.SelectedPath, "*.*", SearchOption.AllDirectories);
        foreach (string fileName in files)
            ProcessFile(fileName);
    }
    lblNumberOfFilesDisplay.Enabled = true;
}
catch (UnauthorizedAccessException) { }
finally {}
Akim answered 5/10, 2008 at 20:0 Comment(0)

You will have to do the recursion manually; don't use AllDirectories. Look at one folder at a time, then try getting the files from its sub-dirs. Untested, but something like below (note it uses a delegate rather than building an array):

using System;
using System.IO;
static class Program
{
    static void Main()
    {
        string path = ""; // TODO
        ApplyAllFiles(path, ProcessFile);
    }
    static void ProcessFile(string path) {/* ... */}
    static void ApplyAllFiles(string folder, Action<string> fileAction)
    {
        foreach (string file in Directory.GetFiles(folder))
        {
            fileAction(file);
        }
        foreach (string subDir in Directory.GetDirectories(folder))
        {
            try
            {
                ApplyAllFiles(subDir, fileAction);
            }
            catch
            {
                // swallow, log, whatever
            }
        }
    }
}
Kylakylah answered 5/10, 2008 at 20:21 Comment(6)
Too good, I have not found anything like this in VB.NET. Hope you don't mind that I have translated it to VB.NET here. – Twinberry
Still not enough: GetFiles throws internally when only one file is inaccessible in a folder, so the whole folder goes untreated. – Killingsworth
It's considerably slower though. – Druce
I'm not sure how this is the accepted answer. GetFiles or GetDirectories can both throw exceptions, causing you to not see anything after the exception. – Reynolds
@Reynolds the comment "swallow, log, whatever" is an invitation for you to add whatever handling is appropriate for your scenario. – Kylakylah
@MarcGravell I don't want to do anything in the catch block. I want to move on to the next file or directory after the one that threw an exception, which is all outside of your try block. HINT: Run that code on the root of your `C:` and it will never make it out of the root folder. – Reynolds

Since .NET Standard 2.1 (.NET Core 3+, .NET 5+), you can now just do:

var filePaths = Directory.EnumerateFiles(@"C:\my\files", "*.xml", new EnumerationOptions
{
    IgnoreInaccessible = true,
    RecurseSubdirectories = true
});

According to the MSDN docs about IgnoreInaccessible:

Gets or sets a value that indicates whether to skip files or directories when access is denied (for example, UnauthorizedAccessException or SecurityException). The default is true.

The default value is already true, but I've kept it here just to show the property.

The same overload is available for DirectoryInfo as well.
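
For illustration, a minimal sketch of the DirectoryInfo flavour (the path and pattern are just placeholders):

var dir = new DirectoryInfo(@"C:\my\files");
foreach (FileInfo file in dir.EnumerateFiles("*.xml", new EnumerationOptions
{
    IgnoreInaccessible = true,       // skip anything that would throw UnauthorizedAccessException/SecurityException
    RecurseSubdirectories = true
}))
{
    Console.WriteLine(file.FullName); // DirectoryInfo yields FileInfo objects rather than path strings
}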

Asynchronism answered 18/5, 2020 at 11:8 Comment(5)
This works in .NET Core 2.1 too, but not .NET Standard 2.0. – Goodrow
This is the only answer that actually works, but it requires you to change your target framework. – Reynolds
This helped a lot. – Vulnerable
This function overload is also included in .NET 5 and .NET 6 RC 2. – Freshman
This is available for .NET Standard 2.0 downlevel via the Microsoft.IO.Redist package on NuGet. – Credulous

This simple function works well and meets the question's requirements.

private List<string> GetFiles(string path, string pattern)
{
    var files = new List<string>();
    var directories = new string[] { };

    try
    {
        files.AddRange(Directory.GetFiles(path, pattern, SearchOption.TopDirectoryOnly));
        directories = Directory.GetDirectories(path);
    }
    catch (UnauthorizedAccessException) { }

    foreach (var directory in directories)
    {
        try
        {
            files.AddRange(GetFiles(directory, pattern));
        }
        catch (UnauthorizedAccessException) { }
    }

    return files;
}
Aborn answered 26/6, 2014 at 20:58 Comment(3)
Yes, without error handling it is of not much use. Just try searching the whole C:\ tree: there are a number of areas in the Windows file system where the user, maybe even with admin rights, does not have enough access rights. That's what the main challenge is all about here (besides junction points and such). – Impracticable
Same problem here. If GetFiles throws an exception, you can't see anything after that. – Reynolds
@Reynolds please try with the edits just made to allow the search to continue after a directory denies access. – Aborn

A simple way to do this is to use a List for the files and a Queue for the directories still to visit. It keeps memory usage modest and avoids deep recursion, which on a very deep directory tree could overflow the stack. The files end up in the output list in breadth-first order, from the top of the directory tree downward.

public static List<string> GetAllFilesFromFolder(string root, bool searchSubfolders) {
    Queue<string> folders = new Queue<string>();
    List<string> files = new List<string>();
    folders.Enqueue(root);
    while (folders.Count != 0) {
        string currentFolder = folders.Dequeue();
        try {
            string[] filesInCurrent = System.IO.Directory.GetFiles(currentFolder, "*.*", System.IO.SearchOption.TopDirectoryOnly);
            files.AddRange(filesInCurrent);
        }
        catch {
            // Do Nothing
        }
        try {
            if (searchSubfolders) {
                string[] foldersInCurrent = System.IO.Directory.GetDirectories(currentFolder, "*.*", System.IO.SearchOption.TopDirectoryOnly);
                foreach (string _current in foldersInCurrent) {
                    folders.Enqueue(_current);
                }
            }
        }
        catch {
            // Do Nothing
        }
    }
    return files;
}

Steps:

  1. Enqueue the root folder.
  2. In a loop, dequeue a folder, add the files in it to the list, and enqueue its subfolders.
  3. Repeat until the queue is empty.
Vesture answered 15/8, 2016 at 16:40 Comment(0)

See https://mcmap.net/q/145674/-need-to-resume-try-after-catch-block for a solution that handles the UnauthorizedAccessException problem.

All the solutions above will miss files and/or directories if any call to GetFiles() or GetDirectories() hits a folder with a mix of permissions.
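
To make the failure mode concrete, here is a minimal sketch (the C:\mixed path is hypothetical): once the listing call throws for a folder, the usual try/catch drops every file in that folder, including the ones that were readable.

string[] names;
try
{
    // if this call throws for any single entry, it returns nothing at all
    names = Directory.GetFiles(@"C:\mixed");
}
catch (UnauthorizedAccessException)
{
    // every accessible file in C:\mixed is silently lost along with the protected one
    names = new string[0];
}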

Unrivalled answered 24/5, 2012 at 13:54 Comment(1)
All solutions involving GetFiles/GetDirectories are bound to have the same problem, and thus are a bit inelegant. – Hanaper

Here's a full-featured, .NET 2.0-compatible implementation.

You can even alter the yielded List of files to skip over directories in the FileSystemInfo version!

(Beware null values!)

public static IEnumerable<KeyValuePair<string, string[]>> GetFileSystemInfosRecursive(string dir, bool depth_first)
{
    foreach (var item in GetFileSystemInfosRecursive(new DirectoryInfo(dir), depth_first))
    {
        string[] result;
        var children = item.Value;
        if (children != null)
        {
            result = new string[children.Count];
            for (int i = 0; i < result.Length; i++)
            { result[i] = children[i].Name; }
        }
        else { result = null; }
        string fullname;
        try { fullname = item.Key.FullName; }
        catch (IOException) { fullname = null; }
        catch (UnauthorizedAccessException) { fullname = null; }
        yield return new KeyValuePair<string, string[]>(fullname, result);
    }
}

public static IEnumerable<KeyValuePair<DirectoryInfo, List<FileSystemInfo>>> GetFileSystemInfosRecursive(DirectoryInfo dir, bool depth_first)
{
    var stack = depth_first ? new Stack<DirectoryInfo>() : null;
    var queue = depth_first ? null : new Queue<DirectoryInfo>();
    if (depth_first) { stack.Push(dir); }
    else { queue.Enqueue(dir); }
    for (var list = new List<FileSystemInfo>(); (depth_first ? stack.Count : queue.Count) > 0; list.Clear())
    {
        dir = depth_first ? stack.Pop() : queue.Dequeue();
        FileSystemInfo[] children;
        try { children = dir.GetFileSystemInfos(); }
        catch (UnauthorizedAccessException) { children = null; }
        catch (IOException) { children = null; }
        if (children != null) { list.AddRange(children); }
        yield return new KeyValuePair<DirectoryInfo, List<FileSystemInfo>>(dir, children != null ? list : null);
        if (depth_first) { list.Reverse(); }
        foreach (var child in list)
        {
            var asdir = child as DirectoryInfo;
            if (asdir != null)
            {
                if (depth_first) { stack.Push(asdir); }
                else { queue.Enqueue(asdir); }
            }
        }
    }
}
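
As a usage sketch of the second overload (the root path and the "bin" filter are just examples): because the yielded Value is the same live list the iterator walks afterwards, removing a DirectoryInfo from it before the next iteration prunes that whole subtree.

foreach (var pair in GetFileSystemInfosRecursive(new DirectoryInfo(@"C:\some\root"), false))
{
    if (pair.Value == null) continue;                 // directory could not be listed; skip it

    // drop any subdirectory named "bin" so the walk never descends into it
    pair.Value.RemoveAll(info => info is DirectoryInfo && info.Name == "bin");

    foreach (var info in pair.Value)
        if (info is FileInfo)
            Console.WriteLine(info.FullName);
}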
Shaven answered 16/4, 2018 at 6:4 Comment(0)

This should answer the question. I've ignored the issue of going through subdirectories; I'm assuming you have that figured out.

Of course, you don't need a separate method for this, but you might find it a useful place to also verify that the path is valid and to deal with the other exceptions you could encounter when calling GetFiles().

Hope this helps.

private string[] GetFiles(string path)
{
    string[] files = null;
    try
    {
       files = Directory.GetFiles(path);
    }
    catch (UnauthorizedAccessException)
    {
       // might be nice to log this, or something ...
    }

    return files;
}

private void Processor(string path, bool recursive)
{
    // leaving the recursive directory navigation out.
    string[] files = this.GetFiles(path);
    if (null != files)
    {
        foreach (string file in files)
        {
           this.Process(file);
        }
    }
    else
    {
       // again, might want to do something when you can't access the path?
    }
}
Mallorca answered 5/10, 2008 at 20:35 Comment(1)
Same problem here. Any directory in which a single entry throws an exception causes you to lose the whole directory listing. – Reynolds

I prefer using the C# framework functions, but the function I need will only be included in .NET 5.0, so I have to write it myself.

// search for files in every subdirectory, ignoring access errors
static List<string> list_files(string path)
{
    List<string> files = new List<string>();

    // add the files in the current directory
    try
    {
        // Directory.GetFiles already returns each entry prefixed with 'path',
        // so there is no need to combine the two again
        string[] entries = Directory.GetFiles(path);

        foreach (string entry in entries)
            files.Add(entry);
    }
    catch
    {
        // an exception from Directory.GetFiles is not recoverable: the directory is not accessible
    }

    // follow the subdirectories
    try
    {
        string[] entries = Directory.GetDirectories(path);

        foreach (string entry in entries)
        {
            List<string> files_in_subdir = list_files(entry);

            foreach (string current_file in files_in_subdir)
                files.Add(current_file);
        }
    }
    catch
    {
        // an exception from Directory.GetDirectories is not recoverable: the directory is not accessible
    }

    return files;
}
Crocein answered 4/8, 2020 at 8:2 Comment(0)

For those whose target framework is below .NET Standard 2.1, just get Microsoft.IO.Redist from NuGet.

var filePaths = Directory.EnumerateFiles(@"C:\my\files", "*.xml", new EnumerationOptions
{
    IgnoreInaccessible = true,
    RecurseSubdirectories = true
});
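
A hedged note on usage (based on how the Microsoft.IO.Redist package is laid out): the backported types live in the Microsoft.IO namespace rather than System.IO, so the snippet above needs a using directive along these lines to resolve to them:

using Microsoft.IO;   // picks up the Microsoft.IO.Redist Directory and EnumerationOptions instead of System.IO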
Credulous answered 26/7, 2023 at 17:31 Comment(1)
This repeats a much earlier answer. – Saxony
