Node.js fs.readdir recursive directory search
Any ideas on an async directory search using fs.readdir? I realize that we could introduce recursion and call the read directory function with the next directory to read, but I'm a little worried about it not being async...

Any ideas? I've looked at node-walk, which is great, but it doesn't give me just the files in an array, like readdir does.

Looking for output like...

['file1.txt', 'file2.txt', 'dir/file3.txt']
Discoid answered 29/4, 2011 at 3:44 Comment(0)
445

There are basically two ways of accomplishing this. In an async environment you'll notice that there are two kinds of loops: serial and parallel. A serial loop waits for one iteration to complete before it moves on to the next, which guarantees that the iterations complete in order. In a parallel loop, all the iterations are started at the same time, and one may complete before another; however, it is much faster than a serial loop. So in this case it's probably better to use a parallel loop, because it doesn't matter what order the walk completes in, as long as it completes and returns the results (unless you want them in order).

A parallel loop would look like this:

var fs = require('fs');
var path = require('path');
var walk = function(dir, done) {
  var results = [];
  fs.readdir(dir, function(err, list) {
    if (err) return done(err);
    var pending = list.length;
    if (!pending) return done(null, results);
    list.forEach(function(file) {
      file = path.resolve(dir, file);
      fs.stat(file, function(err, stat) {
        if (stat && stat.isDirectory()) {
          walk(file, function(err, res) {
            results = results.concat(res);
            if (!--pending) done(null, results);
          });
        } else {
          results.push(file);
          if (!--pending) done(null, results);
        }
      });
    });
  });
};

A serial loop would look like this:

var fs = require('fs');
var path = require('path');
var walk = function(dir, done) {
  var results = [];
  fs.readdir(dir, function(err, list) {
    if (err) return done(err);
    var i = 0;
    (function next() {
      var file = list[i++];
      if (!file) return done(null, results);
      file = path.resolve(dir, file);
      fs.stat(file, function(err, stat) {
        if (stat && stat.isDirectory()) {
          walk(file, function(err, res) {
            results = results.concat(res);
            next();
          });
        } else {
          results.push(file);
          next();
        }
      });
    })();
  });
};

And to test it out on your home directory (WARNING: the results list will be huge if you have a lot of stuff in your home directory):

walk(process.env.HOME, function(err, results) {
  if (err) throw err;
  console.log(results);
});

EDIT: Improved examples.

Wasting answered 29/4, 2011 at 4:29 Comment(16)
Actually, the serial version seems to display files in the root directory, but the parallel doesn't...Discoid
Beware, the "parallel loop" answer from chjj above has a bug in cases when an empty folder is walked. The fix is: var pending = list.length; if(!pending)done(null, results); // add this line! list.forEach(function(file) { ...Lamp
@Wasting Maybe I do not see it as clear as you do, but shall I use some a semaphore in order to protect the results array?Flightless
@MrRoth, as far as I know it isn't really necessary. Despite being asynchronous it is also single threaded: the order can't be guaranteed but there are no chances of contention for the results arrayKordofan
I modified the walk to a find for a file name: code var find = function(dir, filename, done) { ... var path = require('path'); file = path.resolve(dir, file); ... find(file, filename, function(err, res) { .... } else { if (file.indexOf(filename)>1) { // yee haw, found it. results.push(file); done(null,results); } ... find("../","git.exe",function(err,results) { ...Antigone
I downvoted because your answer was great when you first wrote it back in 2011, but in 2014 people use open source modules and write less code themselves and contribute to the modules that they and so many other people depend on. For example try node-dir to get exactly the output required by @Discoid using this line of code: require('node-dir').files(__dirname, function(err, files) { console.log(files); });Genevagenevan
For anyone confused about the !-- syntax, a question has been asked about itCraze
is there a particular reason you used fs instead of fs.promises? wouldnt fs.promises be even better?Syzygy
Is there any chance this can get stuck in an infinite loop in the case of a symlink?Electret
I used the npm module 'directory-tree' and avoided doing this manuallyJill
@Electret yes it doesAsset
Adding a note that the highly upvoted comment's suggested node-dir package hasn't been updated in 5 years (checked on Nov 29th 2021).Lovieloving
This is the problem with NPM and maintaining packages. You have a snippet you can copy and paste, yet you're going to inject a dependency that you don't need into your code base. Then you publish your package on NPM, and people use your package. Meanwhile the dependency you're all relying on doesn't get updated, because the person who wrote it isn't paid to do so.Municipal
Old answer, needs an ESM update, but is a nice answer.Municipal
@Lovieloving adding a note that fs.readdir is broken out of the box in Node v20 while node-dir works perfectly. github.com/nodejs/node/issues/49299Bottomry
@ChristiaanWesterbeek - personally, I'd rather write my own function / library to understand how it works and keep my dependencies minimal. I use modules, but not if I only need 1% of it.Flange
391

This one uses the maximum amount of new, buzzwordy features available in node 8, including Promises, util/promisify, destructuring, async-await, map+reduce and more, making your co-workers scratch their heads as they try to figure out what is going on.

Node 8+

No external dependencies.

const { promisify } = require('util');
const { resolve } = require('path');
const fs = require('fs');
const readdir = promisify(fs.readdir);
const stat = promisify(fs.stat);

async function getFiles(dir) {
  const subdirs = await readdir(dir);
  const files = await Promise.all(subdirs.map(async (subdir) => {
    const res = resolve(dir, subdir);
    return (await stat(res)).isDirectory() ? getFiles(res) : res;
  }));
  return files.reduce((a, f) => a.concat(f), []);
}

Usage

getFiles(__dirname)
  .then(files => console.log(files))
  .catch(e => console.error(e));

Node 10.10+

Updated for node 10+ with even more whizbang:

const { resolve } = require('path');
const { readdir } = require('fs').promises;

async function getFiles(dir) {
  const dirents = await readdir(dir, { withFileTypes: true });
  const files = await Promise.all(dirents.map((dirent) => {
    const res = resolve(dir, dirent.name);
    return dirent.isDirectory() ? getFiles(res) : res;
  }));
  return Array.prototype.concat(...files);
}

Note that starting with node 11.15.0 you can use files.flat() instead of Array.prototype.concat(...files) to flatten the files array.

Node 11+

If you want to blow everybody's head up completely, you can use the following version using async iterators. In addition to being really cool, it also allows consumers to pull results one-at-a-time, making it better suited for really large directories.

const { resolve } = require('path');
const { readdir } = require('fs').promises;

async function* getFiles(dir) {
  const dirents = await readdir(dir, { withFileTypes: true });
  for (const dirent of dirents) {
    const res = resolve(dir, dirent.name);
    if (dirent.isDirectory()) {
      yield* getFiles(res);
    } else {
      yield res;
    }
  }
}

Usage has changed because the return type is now an async iterator instead of a promise:

;(async () => {
  for await (const f of getFiles('.')) {
    console.log(f);
  }
})()

In case somebody is interested, I've written more about async iterators here: https://qwtel.com/posts/software/async-generators-in-the-wild/

Node 20+

As of Node 20, fs.readdir has a { recursive: true } option (note that the returned paths are relative to dir):

const files = await readdir(dir, { recursive: true });
Lettuce answered 16/7, 2017 at 16:42 Comment(16)
The naming of subdir and subdirs is misleading, as those may actually be files (I suggest something like itemInDir or item_in_dir or even simply item instead), but this solution feels cleaner than the accepted one and is much less code. I also don't find it much more complicated than the code in the accepted answer. +1Taster
You could make this even more whizbang by using require(fs).promises and just drop util.promisify completely. Personally I alias fs to fs.promises.Tetrastich
We can make this faster with one small change: passing the 2nd argument to readdir AKA the options object like so readdir(dir, {withFileTypes: true}) this will return all the items with their type information, SO we won't need to call stat at all to obtain the information that readdir now gives us back. This saves us from needing to make additional sys calls. Details hereAlfonsoalfonzo
@cacoder Updated to include withFileTypes. Thanks for the tip.Lettuce
in node 10.10+, if you replace return Array.prototype.concat(...files); with let result = Array.prototype.concat(...files); return result.map(file => file.split('\\').join('/')); you can make sure the dirs return a "/" and not a "\". If you don't mind regex, you can also do return result.map(file => file.replace(/\\/g, '/'));Suspensive
How could this be modified to include a maxDepth parameter? I tried setting a currentDepth counter when starting the function, checking it alongside the if (dirent.isDirectory()), but because this is using a depth-first approach, it isn't working.Martins
You'd pass it as a function parameter and decrement it with every recursive invocation.Lettuce
Is this known to not be able to get caught in infinite loops on symlinks?Electret
@qwtel, check my answer and add your equivalent in your answer, please.Perlite
FWIW, the Node 10.10 version has fantastic performance, compared to a handful of other approaches I've tested.Seda
originally I upvoted this, but I recently read more about async iterators. had seen them but never dove into them before. after basing several functions off of your blog post I have had my mind blown! TY TYAnthe
that async iterator example is sexy. I dig itCori
Node 14+: top-level await? (in keeping with the buzzy theme of this answer) nodejs.org/api/esm.html#enablingBergstein
"maximum amount of new, buzzwordy features available in node 8, including Promises, util/promisify, destructuring, async-await, map+reduce and more, making your co-workers scratch their heads as they try to figure out what is going on." I wish I could lodge a separate upvote for composition.Bauman
GitHub Copilot sent me here as a suggestion, granting someone random credit for the referral. I too wish to register a separate upvote for the references to exploding heads and whizbang.Atropine
FWIW, it actually looks like { recursive: true } option was added in 18.17.0, nodejs.org/docs/latest-v18.x/api/… open the historySacrum
150

Just in case anyone finds it useful, I also put together a synchronous version.

var fs = require('fs');

var walk = function(dir) {
    var results = [];
    var list = fs.readdirSync(dir);
    list.forEach(function(file) {
        file = dir + '/' + file;
        var stat = fs.statSync(file);
        if (stat && stat.isDirectory()) { 
            /* Recurse into a subdirectory */
            results = results.concat(walk(file));
        } else { 
            /* Is a file */
            results.push(file);
        }
    });
    return results;
}

Tip: To use fewer resources when filtering, filter within this function itself. E.g. replace results.push(file); with the code below. Adjust as required:

    file_type = file.split(".").pop();
    file_name = file.split(/(\\|\/)/g).pop();
    if (file_type == "json") results.push(file);
Jurdi answered 22/5, 2013 at 6:2 Comment(7)
This is simple. But also a bit naive. Might cause a stack overflow if a directory contains a link to a parent directory. Maybe use lstat instead? Or else add a recursion-depth check to limit the recursion level.Stercoricolous
Consider using file = require("path").join(dir,file)Rael
@mpen Semi-colons are redundantExcruciation
Instead of file = dir + '/' + file; file = path.join(dir, file); would be more elegantFigureground
was stellar, thank you!Scathing
Absolute winner with respect to brevityMorman
dir recursive sync version npm npmjs.com/package/fs-readdir-recursive npmjs.com/package/walk-sync npmjs.com/package/walkGuidebook
92

A. Have a look at the file module. It has a function called walk:

file.walk(start, callback)

Navigates a file tree, calling callback for each directory, passing in (null, dirPath, dirs, files).

This may be for you! And yes, it is async. However, I think you would have to aggregate the full paths yourself, if you needed them.

B. An alternative, and even one of my favourites: use the Unix find for that. Why reimplement something that has already been programmed? Maybe not exactly what you need, but still worth checking out:

var execFile = require('child_process').execFile;
execFile('find', [ 'somepath/' ], function(err, stdout, stderr) {
  var file_list = stdout.split('\n');
  /* now you've got a list with full path file names */
});

Subsequent searches tend to be very fast, as long as only a few folders have changed, since the operating system caches filesystem metadata (find itself has no built-in cache).

Overdo answered 15/6, 2011 at 13:44 Comment(3)
Had a question about example B: for execFile() (and exec()) the stderr and stdout are Buffers... so wouldn't you need to do stdout.toString().split("\n"), since Buffers are not Strings?Angora
nice, but not cross platform.Kiker
By the way: No, A is not Unix only! Only B is Unix only. However, Windows 10 now comes with a Linux subsystem. So even B would just work on Windows nowadays.Overdo
56

I recommend using node-glob to accomplish that task.

var glob = require( 'glob' );  

glob( 'dirname/**/*.js', function( err, files ) {
  console.log( files );
});
Interoffice answered 10/3, 2015 at 23:58 Comment(1)
A one-liner! Love it!Latricelatricia
44

Another nice npm package is glob.

npm install glob

It is very powerful and should cover all your recursion needs.

Edit:

I actually wasn't perfectly happy with glob, so I created readdirp.

I'm very confident that its API makes finding files and directories recursively and applying specific filters very easy.

Read through its documentation to get a better idea of what it does and install via:

npm install readdirp

Pearl answered 1/6, 2012 at 12:25 Comment(4)
Best module in my opinion, and it's used by many other projects, like Grunt, Mocha, etc., and 80,000+ others. Just saying.Gorlicki
Could you please expand on your reasons to create readdrip @Thorsten LorenzTalkative
What's your problem with glob?Medrano
Herein lies the problem of the npm ecosystem: far too many packages. Why reinvent the wheel? What's wrong with glob?Exculpate
36

Short, Modern and Efficient:

import {readdir} from 'node:fs/promises'
import {join} from 'node:path'

const walk = async (dirPath) => Promise.all(
  await readdir(dirPath, { withFileTypes: true }).then((entries) => entries.map((entry) => {
    const childPath = join(dirPath, entry.name)
    return entry.isDirectory() ? walk(childPath) : childPath
  })),
)

Special thanks to Function for hinting: {withFileTypes: true}.


This automatically keeps the tree structure of the source directory (which you may need). For example, if:

const allFiles = await walk('src')

then allFiles would be a TREE like this:

[
  [
    'src/client/api.js',
    'src/client/http-constants.js',
    'src/client/index.html',
    'src/client/index.js',
    [ 'src/client/res/favicon.ico' ],
    'src/client/storage.js'
  ],
  [ 'src/crypto/keygen.js' ],
  'src/discover.js',
  [
    'src/mutations/createNewMutation.js',
    'src/mutations/newAccount.js',
    'src/mutations/transferCredit.js',
    'src/mutations/updateApp.js'
  ],
  [
    'src/server/authentication.js',
    'src/server/handlers.js',
    'src/server/quick-response.js',
    'src/server/server.js',
    'src/server/static-resources.js'
  ],
  [ 'src/util/prompt.js', 'src/util/safeWriteFile.js' ],
  'src/util.js'
]

Flatten it if you want:

allFiles.flat(Number.POSITIVE_INFINITY)
[
  'src/client/api.js',
  'src/client/http-constants.js',
  'src/client/index.html',
  'src/client/index.js',
  'src/client/res/favicon.ico',
  'src/client/storage.js',
  'src/crypto/keygen.js',
  'src/discover.js',
  'src/mutations/createNewMutation.js',
  'src/mutations/newAccount.js',
  'src/mutations/transferCredit.js',
  'src/mutations/updateApp.js',
  'src/server/authentication.js',
  'src/server/handlers.js',
  'src/server/quick-response.js',
  'src/server/server.js',
  'src/server/static-resources.js',
  'src/util/prompt.js',
  'src/util/safeWriteFile.js',
  'src/util.js'
]
Prole answered 17/2, 2022 at 22:24 Comment(3)
could this be improved or simplified by using dirents ({ withFileTypes: true }) which already contain the info if they're directories or not, saving the lstat step?Berar
@Function; Thank you very much. I edited and improved my answer.Prole
No need to use .flat(), just .push() at the end of the ternary: await deepReadDir(path) : list.push(path)Shroudlaid
17

If you want to use an npm package, wrench is pretty good.

var wrench = require("wrench");

var files = wrench.readdirSyncRecursive("directory");

wrench.readdirRecursive("directory", function (error, files) {
    // live your dreams
});

EDIT (2018):
For anyone reading this more recently: the author deprecated this package in 2015:

wrench.js is deprecated, and hasn't been updated in quite some time. I heavily recommend using fs-extra to do any extra filesystem operations.

Oleviaolfaction answered 24/4, 2012 at 17:45 Comment(2)
@Domenic, how do you denodify this? Callback is fired multiple times (recursively). So using Q.denodify(wrench.readdirRecursive) returns only the first result.Beefcake
@OnurYıldırım yeah, this is not a good fit for promises as-is. You would need to write something that returns multiple promises, or something that waits until all subdirs are enumerated before returning a promise. For the latter, see github.com/kriskowal/q-io#listdirectorytreepathOleviaolfaction
13

Async

const fs = require('fs')
const path = require('path')

const readdir = (p, done, a = [], i = 0) => fs.readdir(p, (e, d = []) =>
  d.map(f => readdir(a[a.push(path.join(p, f)) - 1], () =>
    ++i == d.length && done(a), a)).length || done(a))

readdir(__dirname, console.log)

Sync

const fs = require('fs')
const path = require('path')

const readdirSync = (p, a = []) => {
  if (fs.statSync(p).isDirectory())
    fs.readdirSync(p).map(f => readdirSync(a[a.push(path.join(p, f)) - 1], a))
  return a
}

console.log(readdirSync(__dirname))

Async readable

function readdir (currentPath, done, allFiles = [], i = 0) {
  fs.readdir(currentPath, function (e, directoryFiles = []) {
    if (!directoryFiles.length)
      return done(allFiles)
    directoryFiles.map(function (file) {
      var joinedPath = path.join(currentPath, file)
      allFiles.push(joinedPath)
      readdir(joinedPath, function () {
        i = i + 1
        if (i == directoryFiles.length)
          done(allFiles)
      }, allFiles)
    })
  })
}

readdir(__dirname, console.log)

Note: all versions will follow symlinks (same as the original fs.readdir)

Anaclitic answered 8/4, 2019 at 4:4 Comment(0)
12

With Recursion

var fs = require('fs')
var path = process.cwd()
var files = []

var getFiles = function(path, files){
    fs.readdirSync(path).forEach(function(file){
        var subpath = path + '/' + file;
        if(fs.lstatSync(subpath).isDirectory()){
            getFiles(subpath, files);
        } else {
            files.push(path + '/' + file);
        }
    });     
}

Calling

getFiles(path, files)
console.log(files) // will log all files in directory
Evaporite answered 19/4, 2016 at 22:54 Comment(1)
I'd suggest not joining the path strings with / but using the path module: path.join(searchPath, file). That way, you will get correct paths independent of the OS.Forborne
10

I loved the answer from chjj above and would not have been able to create my version of the parallel loop without that start.

var fs = require("fs");

var tree = function(dir, done) {
  var results = {
        "path": dir
        ,"children": []
      };
  fs.readdir(dir, function(err, list) {
    if (err) { return done(err); }
    var pending = list.length;
    if (!pending) { return done(null, results); }
    list.forEach(function(file) {
      fs.stat(dir + '/' + file, function(err, stat) {
        if (stat && stat.isDirectory()) {
          tree(dir + '/' + file, function(err, res) {
            results.children.push(res);
            if (!--pending){ done(null, results); }
          });
        } else {
          results.children.push({"path": dir + "/" + file});
          if (!--pending) { done(null, results); }
        }
      });
    });
  });
};

module.exports = tree;

I created a Gist as well. Comments welcome. I am still starting out in the NodeJS realm so that is one way I hope to learn more.

Antiknock answered 14/9, 2012 at 0:16 Comment(0)
9

Vanilla ES6 + async/await + small & readable

I didn't find the answer I was looking for in this thread; there were a few similar elements spread across different answers, but I just wanted something simple and readable.

Just in case it helps anyone in the future (i.e. myself in a couple of months), this is what I ended up using:

const { readdir } = require('fs/promises');
const { join } = require('path');

const readdirRecursive = async dir => {
  const files = await readdir( dir, { withFileTypes: true } );

  const paths = files.map( async file => {
    const path = join( dir, file.name );

    if ( file.isDirectory() ) return await readdirRecursive( path );

    return path;
  } );

  return ( await Promise.all( paths ) ).flat( Infinity );
}

module.exports = {
  readdirRecursive,
}
Supergalaxy answered 4/5, 2022 at 21:30 Comment(2)
Is recursiveReaddir supposed to be readdirRecursive?Aristocracy
@Aristocracy It is... Thanks for catching that! The answer has been updated :)Supergalaxy
8

Use node-dir to produce exactly the output you like

var dir = require('node-dir');

dir.files(__dirname, function(err, files) {
  if (err) throw err;
  console.log(files);
  //we have an array of files now, so now we can iterate that array
  files.forEach(function(path) {
    action(null, path);
  })
});
Genevagenevan answered 14/5, 2014 at 14:46 Comment(3)
node-dir was working fine, but when I used it with webpack I had some weird issues. A stray invisible character gets inserted in the readFiles function, as in "if (err)  {", causing an "uncaught SyntaxError: Unexpected token {" error. I am stumped by this issue and my immediate reaction is to replace node-dir with something similarMimosaceous
@Mimosaceous this comment is not going to give you answers. Write a new full question on SO or create an issue at the GitHub repository. When you elaborate well on your question, you might even be able to solve your problem without even having to post itGenevagenevan
@Parth's comment may still be a useful warning for others who are considering your suggestion as the solution to their problem. They may not have been looking for an answer in this comments section :)Entranceway
8

Here is a simple synchronous recursive solution

const fs = require('fs')

const getFiles = path => {
    const files = []
    for (const file of fs.readdirSync(path)) {
        const fullPath = path + '/' + file
        if(fs.lstatSync(fullPath).isDirectory())
            getFiles(fullPath).forEach(x => files.push(file + '/' + x))
        else files.push(file)
    }
    return files
}

Usage:

const files = getFiles(process.cwd())

console.log(files)

You could write it asynchronously, but there is no need. Just make sure that the input directory exists and is accessible.

Mungovan answered 28/1, 2021 at 13:57 Comment(2)
Was looking for a simple sync solution to load some config, and this met my needs. The base path in the forEach should be fullPath so that the file can be easily later read with fs.readFileSync(file) so x => files.push(fullPath + '/' + x))Klusek
Yep you can tweak it however you like, I think fs.readFileSync can understand relative paths though, so depends on where you're running it from but its likely the changes are not even required.Mungovan
8

A modern, promise-based recursive readdir version:

const fs = require('fs');
const path = require('path');

const readDirRecursive = async (filePath) => {
    const dir = await fs.promises.readdir(filePath);
    const files = await Promise.all(dir.map(async relativePath => {
        const absolutePath = path.join(filePath, relativePath);
        const stat = await fs.promises.lstat(absolutePath);

        return stat.isDirectory() ? readDirRecursive(absolutePath) : absolutePath;
    }));

    return files.flat();
}
Prato answered 24/9, 2021 at 17:0 Comment(0)
7

qwtel's answer variant, in TypeScript

import { resolve } from 'path';
import { readdir } from 'fs/promises';

async function* getFiles(dir: string): AsyncGenerator<string> {
    const entries = await readdir(dir, { withFileTypes: true });
    for (const entry of entries) {
        const res = resolve(dir, entry.name);
        if (entry.isDirectory()) {
            yield* getFiles(res);
        } else {
            yield res;
        }
    }
}
Perlite answered 22/12, 2020 at 20:3 Comment(0)
6

Simple, Async Promise Based


const fs = require('fs/promises');
const getDirRecursive = async (dir) => {
    try {
        const items = await fs.readdir(dir);
        let files = [];
        for (const item of items) {
            if ((await fs.lstat(`${dir}/${item}`)).isDirectory()) files = [...files, ...(await getDirRecursive(`${dir}/${item}`))];
            else files.push({file: item, path: `${dir}/${item}`, parents: dir.split("/")});
        }
        return files;
    } catch (e) {
        return e
    }
};

Usage: await getDirRecursive("./public");

Enshroud answered 17/2, 2021 at 8:19 Comment(0)
6

Shortest native solution, available since the v20.1 release:

import fs from 'node:fs'

const results = await fs.promises.readdir('/tmp', { recursive: true })

The recursive option is also supported by the callback-based fs.readdir and the synchronous fs.readdirSync functions.

Keeleykeelhaul answered 30/5, 2023 at 8:53 Comment(0)
5

Using async/await, this should work:

const FS = require('fs');
const Path = require('path');
const readDir = promisify(FS.readdir);
const fileStat = promisify(FS.stat);

async function getFiles(dir) {
    let files = await readDir(dir);

    let result = files.map(file => {
        let path = Path.join(dir,file);
        return fileStat(path).then(stat => stat.isDirectory() ? getFiles(path) : path);
    });

    return flatten(await Promise.all(result));
}

function flatten(arr) {
    return Array.prototype.concat(...arr);
}

You can use bluebird.Promisify or this:

/**
 * Returns a function that will wrap the given `nodeFunction`. Instead of taking a callback, the returned function will return a promise whose fate is decided by the callback behavior of the given node function. The node function should conform to node.js convention of accepting a callback as last argument and calling that callback with error as the first argument and success value on the second argument.
 *
 * @param {Function} nodeFunction
 * @returns {Function}
 */
module.exports = function promisify(nodeFunction) {
    return function(...args) {
        return new Promise((resolve, reject) => {
            nodeFunction.call(this, ...args, (err, data) => {
                if(err) {
                    reject(err);
                } else {
                    resolve(data);
                }
            })
        });
    };
};

Node 8+ has Promisify built-in

See my other answer for a generator approach that can give results even faster.

Indent answered 23/2, 2017 at 0:39 Comment(0)
4

I've coded this recently, and thought it would make sense to share this here. The code makes use of the async library.

var fs = require('fs');
var async = require('async');

var scan = function(dir, suffix, callback) {
  fs.readdir(dir, function(err, files) {
    if (err) {
      return callback(err);
    }
    var returnFiles = [];
    async.each(files, function(file, next) {
      var filePath = dir + '/' + file;
      fs.stat(filePath, function(err, stat) {
        if (err) {
          return next(err);
        }
        if (stat.isDirectory()) {
          scan(filePath, suffix, function(err, results) {
            if (err) {
              return next(err);
            }
            returnFiles = returnFiles.concat(results);
            next();
          })
        }
        else if (stat.isFile()) {
          if (file.indexOf(suffix, file.length - suffix.length) !== -1) {
            returnFiles.push(filePath);
          }
          next();
        }
      });
    }, function(err) {
      callback(err, returnFiles);
    });
  });
};

You can use it like this:

scan('/some/dir', '.ext', function(err, files) {
  // Do something with files that ends in '.ext'.
  console.log(files);
});
Methodism answered 8/4, 2013 at 15:15 Comment(1)
This. This is so tidy and simple to use. I pumped it out into a module, required it and it works like a mcdream sandwich.Zirconia
4

A library called Filehound is another option. It will recursively search a given directory (working directory by default). It supports various filters, callbacks, promises and sync searches.

For example, search the current working directory for all files (using callbacks):

const Filehound = require('filehound');

Filehound.create()
.find((err, files) => {
    if (err) {
        return console.error(`error: ${err}`);
    }
    console.log(files); // array of files
});

Or promises and specifying a specific directory:

const Filehound = require('filehound');

Filehound.create()
.paths("/tmp")
.find()
.each(console.log);

Consult the docs for further use cases and examples of usage: https://github.com/nspragg/filehound

Disclaimer: I'm the author.

Wastebasket answered 14/11, 2016 at 19:28 Comment(0)
3

Check out the final-fs library. It provides a readdirRecursive function:

ffs.readdirRecursive(dirPath, true, 'my/initial/path')
    .then(function (files) {
        // in the `files` variable you've got all the files
    })
    .otherwise(function (err) {
        // something went wrong
    });
Hobnailed answered 5/6, 2013 at 23:37 Comment(0)
2

Standalone promise implementation

I am using the when.js promise library in this example.

var fs = require('fs')
, path = require('path')
, when = require('when')
, nodefn = require('when/node/function');

function walk (directory, includeDir) {
    var results = [];
    return when.map(nodefn.call(fs.readdir, directory), function(file) {
        file = path.join(directory, file);
        return nodefn.call(fs.stat, file).then(function(stat) {
            if (stat.isFile()) { return results.push(file); }
            if (includeDir) { results.push(file + path.sep); }
            return walk(file, includeDir).then(function(filesInDir) {
                results = results.concat(filesInDir);
            });
        });
    }).then(function() {
        return results;
    });
};

walk(__dirname).then(function(files) {
    console.log(files);
}).otherwise(function(error) {
    console.error(error.stack || error);
});

I've included an optional parameter includeDir which will include directories in the file listing if set to true.

Zirconia answered 20/9, 2013 at 21:32 Comment(0)
2

The recursive-readdir module has this functionality.

Succinylsulfathiazole answered 25/11, 2014 at 0:19 Comment(0)
2

klaw and klaw-sync are worth considering for this sort of thing. These were part of node-fs-extra.

Doolittle answered 23/1, 2017 at 15:40 Comment(0)
I
2

For Node 10.10+ (the withFileTypes option landed in 10.10), here is a for-await solution:

#!/usr/bin/env node

const FS = require('fs');
const Util = require('util');
const readDir = Util.promisify(FS.readdir);
const Path = require('path');

async function* readDirR(path) {
    const entries = await readDir(path,{withFileTypes:true});
    for(let entry of entries) {
        const fullPath = Path.join(path,entry.name);
        if(entry.isDirectory()) {
            yield* readDirR(fullPath);
        } else {
            yield fullPath;
        }
    }
}

async function main() {
    const start = process.hrtime.bigint();
    for await(const file of readDirR('/mnt/home/media/Unsorted')) {
        console.log(file);
    }
    console.log((process.hrtime.bigint()-start)/1000000n);
}

main().catch(err => {
    console.error(err);
});

The benefit of this solution is that you can start processing the results immediately; e.g. it takes 12 seconds to read all the files in my media directory, but if I do it this way I can get the first result within a few milliseconds.

Indent answered 4/3, 2019 at 7:39 Comment(0)
F
1

Here's yet another implementation. None of the above solutions have any limiters, and so if your directory structure is large, they're all going to thrash and eventually run out of resources.

var async = require('async');
var fs = require('fs');
var resolve = require('path').resolve;

var scan = function(path, concurrency, callback) {
    var list = [];

    var walker = async.queue(function(path, callback) {
        fs.stat(path, function(err, stats) {
            if (err) {
                return callback(err);
            } else {
                if (stats.isDirectory()) {
                    fs.readdir(path, function(err, files) {
                        if (err) {
                            callback(err);
                        } else {
                            for (var i = 0; i < files.length; i++) {
                                walker.push(resolve(path, files[i]));
                            }
                            callback();
                        }
                    });
                } else {
                    list.push(path);
                    callback();
                }
            }
        });
    }, concurrency);

    walker.push(path);

    walker.drain = function() {
        callback(list);
    }
};

Using a concurrency of 50 works pretty well, and is almost as fast as simpler implementations for small directory structures.

Foust answered 6/8, 2014 at 16:7 Comment(0)
G
1

I modified Trevor Senior's Promise-based answer to work with Bluebird:

var fs = require('fs'),
    path = require('path'),
    Promise = require('bluebird');

var readdirAsync = Promise.promisify(fs.readdir);
var statAsync = Promise.promisify(fs.stat);
function walkFiles (directory) {
    var results = [];
    return readdirAsync(directory).map(function(file) {
        file = path.join(directory, file);
        return statAsync(file).then(function(stat) {
            if (stat.isFile()) {
                return results.push(file);
            }
            return walkFiles(file).then(function(filesInDir) {
                results = results.concat(filesInDir);
            });
        });
    }).then(function() {
        return results;
    });
}

//use
walkFiles(__dirname).then(function(files) {
    console.log(files);
}).catch(function(e) {
    console.error(e);
});
Gaudery answered 24/1, 2015 at 21:51 Comment(0)
C
1

For fun, here is a flow-based version built with the highland.js streams library. It was co-authored by Victor Vu.

###
  directory >---m------> dirFilesStream >---------o----> out
                |                                 |
                |                                 |
                +--------< returnPipe <-----------+

  legend: (m)erge  (o)bserve

 + directory         has the initial file
 + dirFilesStream    does a directory listing
 + out               prints out the full path of the file
 + returnPipe        runs stat and filters on directories

###

_ = require('highland')
fs = require('fs')
fsPath = require('path')

directory = _(['someDirectory'])
mergePoint = _()
dirFilesStream = mergePoint.merge().flatMap((parentPath) ->
  _.wrapCallback(fs.readdir)(parentPath).sequence().map (path) ->
    fsPath.join parentPath, path
)
out = dirFilesStream
# Create the return pipe
returnPipe = dirFilesStream.observe().flatFilter((path) ->
  _.wrapCallback(fs.stat)(path).map (v) ->
    v.isDirectory()
)
# Connect up the merge point now that we have all of our streams.
mergePoint.write directory
mergePoint.write returnPipe
mergePoint.end()
# Release backpressure.  This will print files as they are discovered
out.each _.log
# Another way would be to queue them all up and then print them all out at once.
# out.toArray((files)-> console.log(files))
Calc answered 10/9, 2015 at 16:8 Comment(0)
F
1

Using Promises (Q) to solve this in a Functional style:

var fs = require('fs'),
    fsPath = require('path'),
    Q = require('q');

var walk = function (dir) {
  return Q.ninvoke(fs, 'readdir', dir).then(function (files) {

    return Q.all(files.map(function (file) {

      file = fsPath.join(dir, file);
      return Q.ninvoke(fs, 'lstat', file).then(function (stat) {

        if (stat.isDirectory()) {
          return walk(file);
        } else {
          return [file];
        }
      });
    }));
  }).then(function (files) {
    return files.reduce(function (pre, cur) {
      return pre.concat(cur);
    }, []); // seed with [] so an empty directory doesn't throw
  });
};

It returns a promise of an array, so you can use it as:

walk('/home/mypath').then(function (files) { console.log(files); });
Forbid answered 25/11, 2015 at 13:18 Comment(0)
M
1

I must add the Promise-based sander library to the list.

 var sander = require('sander');
 sander.lsr(directory).then( filenames => { console.log(filenames) } );
Mohave answered 14/1, 2016 at 12:58 Comment(0)
P
1

Using bluebird promise.coroutine:

let promise = require('bluebird'),
    path = require('path'),
    PC = promise.coroutine,
    fs = promise.promisifyAll(require('fs'));
let getFiles = PC(function*(dir){
    let files = [];
    let contents = yield fs.readdirAsync(dir);
    // remove dot (hidden) files on Mac; filter instead of splicing inside
    // a forward loop, which would skip the entry after each removal
    contents = contents.filter(name => !/^\./.test(name));
    for (let i = 0, l = contents.length; i < l; i ++) {
        let content = path.resolve(dir, contents[i]);
        let contentStat = yield fs.statAsync(content);
        if (contentStat && contentStat.isDirectory()) {
            let subFiles = yield getFiles(content);
            files = files.concat(subFiles);
        } else {
            files.push(content);
        }
    }
    return files;
});
//how to use
//easy error handling in one place
getFiles(your_dir).then(console.log).catch(err => console.log(err));
Pennant answered 24/8, 2016 at 8:34 Comment(0)
F
1

Yet another answer, but this time using TypeScript:

import * as fs from 'fs';
import * as path from 'path';

/**
 * Recursively walk a directory asynchronously and obtain all file names (with full path).
 *
 * @param dir Folder name you want to recursively process
 * @param done Callback function, returns all files with full path.
 * @param filter Optional filter to specify which files to include, 
 *   e.g. for json files: (f: string) => /.json$/.test(f)
 */
const walk = (
  dir: string,
  done: (err: Error | null, results ? : string[]) => void,
  filter ? : (f: string) => boolean
) => {
  let results: string[] = [];
  fs.readdir(dir, (err: Error, list: string[]) => {
    if (err) {
      return done(err);
    }
    let pending = list.length;
    if (!pending) {
      return done(null, results);
    }
    list.forEach((file: string) => {
      file = path.resolve(dir, file);
      fs.stat(file, (err2, stat) => {
        if (stat && stat.isDirectory()) {
          walk(file, (err3, res) => {
            if (res) {
              results = results.concat(res);
            }
            if (!--pending) {
              done(null, results);
            }
          }, filter);
        } else {
          if (typeof filter === 'undefined' || (filter && filter(file))) {
            results.push(file);
          }
          if (!--pending) {
            done(null, results);
          }
        }
      });
    });
  });
};
Fortenberry answered 15/5, 2018 at 8:20 Comment(0)
B
1

Promise based recursive solution in TypeScript using Array.flat() for handling nested returns.

import { resolve } from 'path'
import { Dirent } from 'fs'
import * as fs from 'fs'

function getFiles(root: string): Promise<string[]> {
 return fs.promises
   .readdir(root, { withFileTypes: true })
   .then(dirents => {
      const mapToPath = (r: string) => (dirent: Dirent): string => resolve(r, dirent.name)
      const directoryPaths = dirents.filter(a => a.isDirectory()).map(mapToPath(root))
      const filePaths = dirents.filter(a => a.isFile()).map(mapToPath(root))

     return Promise.all<string>([
       ...directoryPaths.map(a => getFiles(a)),
       ...filePaths.map(a => Promise.resolve(a))
     ]).then(a => a.flat())
  })
}
Biceps answered 9/3, 2020 at 6:30 Comment(0)
S
1

For anyone who wants a synchronous alternative to the accepted answer (I know I did):

var fs = require('fs');
var path = require('path');
var walk = function(dir) {
    // readdirSync/statSync throw on error; let the exception propagate
    // rather than returning an error string that would end up inside the
    // results array
    let results = [];
    let list = fs.readdirSync(dir);
    for (let file of list) {
        file = path.resolve(dir, file);
        let stat = fs.statSync(file);
        if (stat && stat.isDirectory()) {
            results = results.concat(walk(file));
        } else {
            results.push(file);
        }
    }
    return results;
};

console.log(
    walk("./")
)
Supposal answered 31/3, 2020 at 8:49 Comment(0)
O
1

There is a new module called cup-readdir that recursively searches directories very fast. It uses asynchronous promises and outperforms many popular modules when dealing with deep directory structures.

It can return all files in an array and sort them by their properties, but lacks features like file filtering and entering symlinked directories. This could be useful for large projects where you simply want to get every file from a directory. Here is a link to their project homepage.

Obedient answered 19/1, 2021 at 23:41 Comment(0)
E
1

One more approach. I'll just leave it here; maybe it will be useful for someone in the future.

const fs = require("fs");
const { promisify } = require("util");
const p = require("path");
const readdir = promisify(fs.readdir);

async function getFiles(path) {
  try {
    const entries = await readdir(path, { withFileTypes: true });

    const files = entries
      .filter((file) => !file.isDirectory())
      .map((file) => ({
        path: `${path}/${file.name}`,
        ext: p.extname(`${path}/${file.name}`),
        pathDir: path,
      }));

    const folders = entries.filter((folder) => folder.isDirectory());

    for (const folder of folders) {
      files.push(...(await getFiles(`${path}/${folder.name}`)));
    }
    return files;
  } catch (error) {
    // rethrow so the caller's .catch() receives the failure
    throw error;
  }
}

Usage:

getFiles(rootFolderPath)
  .then((files) => console.log(files))
  .catch((error) => console.error(error));

Epistyle answered 28/3, 2022 at 13:41 Comment(0)
S
0

Tested in Node.js v21:

const fs = require('fs/promises');
const path = require('path');

async function listDirRecursive(pathToDir, onlyFiles, onlyDir) {
    let result = await fs.readdir(pathToDir, { withFileTypes: true, recursive: true });
    if (onlyFiles)
        result = result.filter(e => e.isFile());
    else if (onlyDir)
        result = result.filter(e => e.isDirectory());

    return result.map(e => path.join(e.path, e.name));
}
Shaquitashara answered 29/4, 2011 at 3:44 Comment(0)
S
0

Because everyone should write their own, I made one.

walk(dir, cb, endCb) cb(file) endCb(err | null)

DIRTY

module.exports = walk;

function walk(dir, cb, endCb) {
  var fs = require('fs');
  var path = require('path');

  fs.readdir(dir, function(err, files) {
    if (err) {
      return endCb(err);
    }

    var pending = files.length;
    if (pending === 0) {
      endCb(null);
    }
    files.forEach(function(file) {
      fs.stat(path.join(dir, file), function(err, stats) {
        if (err) {
          return endCb(err)
        }

        if (stats.isDirectory()) {
          walk(path.join(dir, file), cb, function() {
            pending--;
            if (pending === 0) {
              endCb(null);
            }
          });
        } else {
          cb(path.join(dir, file));
          pending--;
          if (pending === 0) {
            endCb(null);
          }
        }
      })
    });

  });
}
Singularize answered 5/3, 2013 at 16:11 Comment(0)
R
0

Check out loaddir: https://npmjs.org/package/loaddir

npm install loaddir

  loaddir = require('loaddir')

  allJavascripts = []
  loaddir({
    path: __dirname + '/public/javascripts',
    callback: function(){  allJavascripts.push(this.relativePath + this.baseName); }
  })

You can use fileName instead of baseName if you need the extension as well.

An added bonus is that it will watch the files as well and call the callback again. There are tons of configuration options to make it extremely flexible.

Using loaddir, I recently reimplemented Ruby's guard gem in very little time.

Rambling answered 15/8, 2013 at 14:31 Comment(0)
A
0

This is my answer. Hope it can help somebody.

My focus is on letting the search routine stop at any point, and on reporting, for each file found, its depth relative to the original path.

var _fs = require('fs');
var _path = require('path');
var _defer = process.nextTick;

// next() will pop the first element from an array and return it, together with
// the recursive depth and the container array of the element. i.e. If the first
// element is an array, it'll be dug into recursively. But if the first element is
// an empty array, it'll be simply popped and ignored.
// e.g. If the original array is [1,[2],3], next() will return [1,0,[[2],3]], and
// the array becomes [[2],3]. If the array is [[[],[1,2],3],4], next() will return
// [1,2,[2]], and the array becomes [[[2],3],4].
// There is an infinity loop `while(true) {...}`, because I optimized the code to
// make it a non-recursive version.
var next = function(c) {
    var a = c;
    var n = 0;
    while (true) {
        if (a.length == 0) return null;
        var x = a[0];
        if (x.constructor == Array) {
            if (x.length > 0) {
                a = x;
                ++n;
            } else {
                a.shift();
                a = c;
                n = 0;
            }
        } else {
            a.shift();
            return [x, n, a];
        }
    }
}

// cb is the callback function, it have four arguments:
//    1) an error object if any exception happens;
//    2) a path name, may be a directory or a file;
//    3) a flag, `true` means directory, and `false` means file;
//    4) a zero-based number indicates the depth relative to the original path.
// cb should return a state value to tell whether the searching routine should
// continue: `true` means it should continue; `false` means it should stop here;
// but for a directory, there is a third state `null`, means it should do not
// dig into the directory and continue searching the next file.
var ls = function(path, cb) {
    // use `_path.resolve()` to correctly handle '.' and '..'.
    var c = [ _path.resolve(path) ];
    var f = function() {
        var p = next(c);
        p && s(p);
    };
    var s = function(p) {
        _fs.stat(p[0], function(err, ss) {
            if (err) {
                // use `_defer()` to turn a recursive call into a non-recursive call.
                cb(err, p[0], null, p[1]) && _defer(f);
            } else if (ss.isDirectory()) {
                var y = cb(null, p[0], true, p[1]);
                if (y) r(p);
                else if (y == null) _defer(f);
            } else {
                cb(null, p[0], false, p[1]) && _defer(f);
            }
        });
    };
    var r = function(p) {
        _fs.readdir(p[0], function(err, files) {
            if (err) {
                cb(err, p[0], true, p[1]) && _defer(f);
            } else {
                // not use `Array.prototype.map()` because we can make each change on site.
                for (var i = 0; i < files.length; i++) {
                    files[i] = _path.join(p[0], files[i]);
                }
                p[2].unshift(files);
                _defer(f);
            }
        });
    }
    _defer(f);
};

var printfile = function(err, file, isdir, n) {
    if (err) {
        console.log('-->   ' + ('[' + n + '] ') + file + ': ' + err);
        return true;
    } else {
        console.log('... ' + ('[' + n + '] ') + (isdir ? 'D' : 'F') + ' ' + file);
        return true;
    }
};

var path = process.argv[2];
ls(path, printfile);
Anility answered 15/9, 2014 at 0:38 Comment(0)
A
0

Here's a recursive method of getting all files including subdirectories.

const FileSystem = require("fs");
const Path = require("path");

//...

function getFiles(directory) {
    directory = Path.normalize(directory);
    let files = FileSystem.readdirSync(directory).map((file) => directory + Path.sep + file);

    // A classic loop is needed here: forEach fixes its iteration count up
    // front, so entries shifted right by the splice below would be skipped.
    for (let index = 0; index < files.length; index++) {
        if (FileSystem.statSync(files[index]).isDirectory()) {
            const nested = getFiles(files[index]);
            files.splice(index, 1, ...nested);
            index += nested.length - 1; // the inserted entries are already files
        }
    }

    return files;
}
Armipotent answered 3/2, 2017 at 4:51 Comment(0)
L
0

Another simple and helpful one

function walkDir(root) {
    const stat = fs.statSync(root);

    if (stat.isDirectory()) {
        const dirs = fs.readdirSync(root).filter(item => !item.startsWith('.'));
        let results = dirs.map(sub => walkDir(`${root}/${sub}`));
        return [].concat(...results);
    } else {
        return root;
    }
}
Lali answered 24/2, 2017 at 14:58 Comment(1)
You are assuming that every file in the root directory is a folder here.Spectral
S
0

This is how I use the Node.js fs.readdir function to recursively search a directory.

const fs = require('fs');
const mime = require('mime-types');
const readdirRecursivePromise = path => {
    return new Promise((resolve, reject) => {
        fs.readdir(path, (err, directoriesPaths) => {
            if (err) {
                reject(err);
            } else {
                if (directoriesPaths.indexOf('.DS_Store') != -1) {
                    directoriesPaths.splice(directoriesPaths.indexOf('.DS_Store'), 1);
                }
                directoriesPaths.forEach((e, i) => {
                    directoriesPaths[i] = statPromise(`${path}/${e}`);
                });
                Promise.all(directoriesPaths).then(out => {
                    resolve(out);
                }).catch(err => {
                    reject(err);
                });
            }
        });
    });
};
const statPromise = path => {
    return new Promise((resolve, reject) => {
        fs.stat(path, (err, stats) => {
            if (err) {
                reject(err);
            } else {
                if (stats.isDirectory()) {
                    readdirRecursivePromise(path).then(out => {
                        resolve(out);
                    }).catch(err => {
                        reject(err);
                    });
                } else if (stats.isFile()) {
                    resolve({
                        'path': path,
                        'type': mime.lookup(path)
                    });
                } else {
                    reject(`Error parsing path: ${path}`);
                }
            }
        });
    });
};
const flatten = (arr, result = []) => {
    for (let i = 0, length = arr.length; i < length; i++) {
        const value = arr[i];
        if (Array.isArray(value)) {
            flatten(value, result);
        } else {
            result.push(value);
        }
    }
    return result;
};

Let's say you have a path called '/database' in your node projects root. Once this promise is resolved, it should spit out an array of every file under '/database'.

readdirRecursivePromise('database').then(out => {
    console.log(flatten(out));
}).catch(err => {
    console.log(err);
});
Sawhorse answered 3/5, 2017 at 8:13 Comment(0)
C
0

Just a simple walk, keeping a queue of pending directories:

const fs = require('fs');
const path = require('path');

let pending = [baseFolderPath]; // baseFolderPath: the directory to start from
let results = [];
function walk() {
    const dir = pending.shift();
    for (const entry of fs.readdirSync(dir, { withFileTypes: true })) {
        const full = path.join(dir, entry.name);
        if (entry.isDirectory()) pending.push(full);
        else results.push(full);
    }
    if (pending.length) walk();
}
walk();
Careaga answered 3/4, 2020 at 6:18 Comment(0)
I
0

well, this is

  • short
  • readable
  • no external libs
  • no explicit stat
  • no extra arguments
  • flat list
import { readdir } from 'fs/promises'
import * as path from 'path'

export const files = async (directory: string): Promise<string[]> => await
  (await readdir(directory, { withFileTypes: true }))
  .reduce(async (_, o) => {
      var _path = path.join(directory, o.name)
      return [...await _, ...o.isDirectory() ? (await files(_path)) : [_path]]
    },
    Promise.resolve([] as string[])
  )

use:

var myfiles: string[] = await files('/home/toddmo/Pictures')
Irrepealable answered 23/2, 2023 at 21:7 Comment(0)
R
0

You can use this npm module:

npm dree

It walks the whole directory tree and returns it as a string or as an object. Its file callback lets you collect exactly the paths you need.

Example:

const dree = require('dree');
const options = {
    followLinks: true,               // If you want to follow the folders pointed by symbolic links
    depth: 5,                        // If you want to stop after 5 directory levels
    exclude: /dir_to_exclude/,       // If you want to exclude some paths with a regexp
    extensions: [ 'txt', 'jpg' ]     // If you want only some extensions
};

const paths = [];
const fileCallback = function (file) {
    paths.push(file.relativePath);
};

let tree;
// Do it synchronously
tree = dree.scan('./dir', options, fileCallback);

// Do it asynchronously (returns promise)
tree = await dree.scanAsync('./dir', options, fileCallback);

// Now paths contains the paths you want
console.log(paths);

// tree contains an object representing the directory tree (filtered in base of the conditions)

Note that if you use await, this code should be included in an async function. A promise is returned, so you can use the .then() method.

Rochette answered 7/2 at 16:31 Comment(0)
E
0

The simplest option is to use fast-glob:

npm install fast-glob

dir/**/*.?? — matches files with a two-character extension under 'dir' (any level of nesting); use dir/**/* to match all files

const fg = require('fast-glob');

const getAllFiles = (dirPath, extension = '*.??') => {
  return fg([`${dirPath}/**/${extension}`], { dot: true });
};

const getAllTxtFiles = async (dirPath) => {
  const files = await getAllFiles(dirPath, '*.txt');
  console.log('Found files:', files);
};

Elishaelision answered 6/3 at 16:45 Comment(0)
V
-1

Here is complete working code, as per your requirement: you can get all files and folders recursively.

var fs = require('fs');
var path = require('path');

var recur = function(dir) {
    fs.readdir(dir, function(err, list) {
        if (err) return console.error(err);
        list.forEach(function(file) {
            var file2 = path.resolve(dir, file);
            fs.stat(file2, function(err, stats) {
                if (stats && stats.isDirectory()) {
                    recur(file2);
                } else {
                    console.log(file2);
                }
            });
        });
    });
};
recur(rootDir);

In rootDir, give the path of the directory you want to search, e.g. "c:\test".

Vashtee answered 9/7, 2016 at 5:58 Comment(0)

© 2022 - 2024 — McMap. All rights reserved.