How do you get a list of the names of all files present in a directory in Node.js?
T

33

1608

I'm trying to get a list of the names of all the files present in a directory using Node.js. I want output that is an array of filenames. How can I do this?

Taneka answered 28/4, 2010 at 6:10 Comment(8)
fs.readdir works, but cannot use file name glob patterns like ls /tmp/*core*. Check out github.com/isaacs/node-glob. Globs can even search in sub-directories.Koala
Check out NPM's readdir-recursive module if you're also looking for the names of files in subdirectories.Oblivion
es7 method with await hereToolis
fs.readdir is a simple async solution - examples hereTimoteo
Still no answer using an iterator? I have 2.5 million files to scan… I do not want to get a list of 2.5M paths after 10 minutes.Ligneous
@FlavienVolken, you are looking for nodejs.org/api/fs.html#fs_dir_readHidalgo
@Hidalgo #56299494Ligneous
Year 2022 - Read the documentation: nodejs.org/api/fs.html#fspromisesreaddirpath-optionsNarcotize
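
For the iterator-based approach mentioned in the comments above (useful for directories with millions of entries), here is a minimal sketch using fs.promises.opendir (Node >= 12.12), which yields entries one at a time instead of building one huge array; the '/tmp' path is just a placeholder:

const fs = require('fs').promises;

async function* listFileNames(dir) {
  const handle = await fs.opendir(dir);
  for await (const entry of handle) {
    if (entry.isFile()) yield entry.name;
  }
}

(async () => {
  for await (const name of listFileNames('/tmp')) {
    console.log(name);
  }
})();
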
R
2151

You can use the fs.readdir or fs.readdirSync methods. fs is included in Node.js core, so there's no need to install anything.

fs.readdir

const testFolder = './tests/';
const fs = require('fs');

fs.readdir(testFolder, (err, files) => {
  files.forEach(file => {
    console.log(file);
  });
});

fs.readdirSync

const testFolder = './tests/';
const fs = require('fs');

fs.readdirSync(testFolder).forEach(file => {
  console.log(file);
});

The difference between the two methods is that the first one is asynchronous, so you have to provide a callback function that will be executed when the read process ends.

The second is synchronous: it returns the file name array directly, but it blocks any further execution of your code until the read process ends.
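
Note that the asynchronous example above ignores the err argument. A minimal sketch that handles it (assuming the same ./tests/ folder) could look like this:

const fs = require('fs');
const testFolder = './tests/';

fs.readdir(testFolder, (err, files) => {
  if (err) {
    // e.g. the folder does not exist or is not readable
    console.error('Could not list the directory.', err);
    return;
  }
  files.forEach(file => console.log(file));
});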

Reconsider answered 28/4, 2010 at 6:15 Comment(10)
Note: readdir also shows directory names. To filter these, use fs.stat(path, callback(err, stats)) and stats.isDirectory().Nolpros
I should add that most probably you should go with readdir because you don't want to block I/O in Node.Rademacher
@user3705055 unless you're using gulp to read in a directory of source order dependant files and compile them into a single executable.Puebla
For the newer promise method see my answer.Toolis
I'm confused... Wouldn't it be better to use ls or dir /b/s for this job? Would have thought these methods would be much faster than iterating in Node...Gilley
@Gilley You want to try parsing the output of ls? Just wait until somebody creates some filenames with embedded spaces and newlines…Turbary
@RadonRosborough yeah recently found out that ls isn't really good for file lists. But find is pretty good at it :)Gilley
As of Node v10.10.0, a combination of the withFileTypes option for the readdir and readdirSync functions and the isDirectory() method can be used to filter just the files in the directory - docs and an example hereConcierge
How would you do this in Typescript? I get this error when I try to do in typescript: "TypeError: fs.readdir is not a function" Any help is appreciated.Blinking
In the readdirSync I push the file into my array I defined in first line of the class. In the readdirSync the array size is 1 but, out of that scope, the size is 0!Barbra
G
276

IMO the most convenient way to do such tasks is to use a glob tool. Here's a glob package for node.js. Install with

npm install glob

Then use a wildcard to match filenames (example taken from the package's website):

var glob = require("glob")

// options is optional (an empty object works)
var options = {};
glob("**/*.js", options, function (er, files) {
  // files is an array of filenames.
  // If the `nonull` option is set, and nothing
  // was found, then files is ["**/*.js"]
  // er is an error object or null.
})
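
As mentioned in the comments below, glob also has a synchronous variant, glob.sync, which simply returns the array of matches. A minimal sketch (the nodir option, which skips directories, is assumed to be available in your glob version):

var glob = require("glob");

// returns an array of matching paths; no callback needed
var jsFiles = glob.sync("**/*.js", { nodir: true });
console.log(jsFiles);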

If you are planning on using globby, here is an example that looks for any XML files under the current folder (note that await must be used inside an async function):

var globby = require('globby');

(async () => {
    const paths = await globby("**/*.xml");
    console.log(paths);
})();
Goingover answered 30/8, 2014 at 7:57 Comment(6)
this was the best solution for me as i wanted to specify filetype easier than string comparisons. Thanks.Seward
I like this one too just because globbing is almost a fundamental skill in node. If you want to just get filenames back, pass in a cwd in the options object.Pelkey
How can get the results of glob outside of itself? Eg. I want to console.log the results, but not inside glob()?Ferraro
@Lanti: The glob.sync(pattern, [options]) method may be easier to use as it simply returns an array of file names, rather than using a callback. More info here: github.com/isaacs/node-globCompensation
For people like me looking for a glob implementation using Promises, check out globby by sindresorhus: github.com/sindresorhus/globbySurrejoinder
I have updated the answer with @NachoColoma's comment, showing how to use itBadajoz
C
206

As of Node v10.10.0, it is possible to use the new withFileTypes option for fs.readdir and fs.readdirSync in combination with the dirent.isDirectory() function to filter for filenames in a directory. That looks like this:

fs.readdirSync('./dirpath', {withFileTypes: true})
.filter(item => !item.isDirectory())
.map(item => item.name)

The returned array is in the form:

['file1.txt', 'file2.txt', 'file3.txt']
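
The asynchronous fs.promises.readdir accepts the same withFileTypes option, so a promise-based sketch of the equivalent (assuming Node >= 10.10; './dirpath' is a placeholder) would be:

const fs = require('fs').promises;

async function listFiles(dirPath) {
  const entries = await fs.readdir(dirPath, { withFileTypes: true });
  return entries
    .filter(entry => !entry.isDirectory())
    .map(entry => entry.name);
}

listFiles('./dirpath').then(names => console.log(names));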
Concierge answered 30/1, 2019 at 11:50 Comment(3)
this is what people are searching for in 2020 - should be "pinned"Fallacious
And 2022 as well!Ezara
Excellent, this answers the question with regards to names of the filesAllowable
G
192

The answer above does not perform a recursive search into the directory though. Here's what I did for a recursive search (using node-walk: npm install walk)

var walk    = require('walk');
var files   = [];

// Walker options
var walker  = walk.walk('./test', { followLinks: false });

walker.on('file', function(root, stat, next) {
    // Add this file to the list of files
    files.push(root + '/' + stat.name);
    next();
});

walker.on('end', function() {
    console.log(files);
});
Gombosi answered 27/4, 2011 at 7:35 Comment(9)
fs.readdirSync is better, native alternative created specially for this.Soren
fs.readdirSync doesn't walk into sub directories unfortunately, unless you are willing to write your own routine to do just that, which you don't given that there are already npm modules out there to solve this very problem.Gombosi
Here is a link to the walk github repo + docs: github.com/coolaj86/node-walkKafiristan
OP did not ask about which API does a recursive read. In any case, the accepted answer provides what can also serve as a basis for making a recursive read.Isomorph
This is a fantastic function. Quick question: is there a quick way to ignore certain dirs? I want to ignore directories starting with .gitFir
For the newer ES7 method see my answer.Toolis
> unless you are willing to write your own routine to do just that, which you don't given that there are already npm modules out there to solve this very problem Yeah, you don't want to write left-pad yourself too when there's a package for that! :/Castorena
I like this one.Salade
@RubenTan How to check the filetype? Cause it lists hidden files too. On mac: .DS_Store, for example.Barbra
J
129

Get files in all subdirs

const fs=require('fs');

function getFiles (dir, files_){
    files_ = files_ || [];
    var files = fs.readdirSync(dir);
    for (var i in files){
        var name = dir + '/' + files[i];
        if (fs.statSync(name).isDirectory()){
            getFiles(name, files_);
        } else {
            files_.push(name);
        }
    }
    return files_;
}

console.log(getFiles('path/to/dir'))
Journeywork answered 11/12, 2013 at 17:25 Comment(5)
Why if (typeof files_ === 'undefined') files_=[];? you only need to do var files_ = files_ || []; instead of files_ = files_ || [];.Gaven
You forgot to add var fs = require('fs'); at the start of getFiles.Might
This is a recursive method. It does not support very deep folder structures, which will result in a Stack Overflow.Filicide
@MathiasLykkegaardLorenzen If you've got a file system nested 11k directories deep you've probably got a lot of other things to worry about :pThatch
It doesn't have to be 11k. It depends on how much is put on the stack, and this method has quite large allocations to the stack.Filicide
S
79

Here's a simple solution using only the native fs and path modules:

// sync version
function walkSync(currentDirPath, callback) {
    var fs = require('fs'),
        path = require('path');
    fs.readdirSync(currentDirPath).forEach(function (name) {
        var filePath = path.join(currentDirPath, name);
        var stat = fs.statSync(filePath);
        if (stat.isFile()) {
            callback(filePath, stat);
        } else if (stat.isDirectory()) {
            walkSync(filePath, callback);
        }
    });
}

or async version (uses fs.readdir instead):

// async version with basic error handling
function walk(currentDirPath, callback) {
    var fs = require('fs'),
        path = require('path');
    fs.readdir(currentDirPath, function (err, files) {
        if (err) {
            throw new Error(err);
        }
        files.forEach(function (name) {
            var filePath = path.join(currentDirPath, name);
            var stat = fs.statSync(filePath);
            if (stat.isFile()) {
                callback(filePath, stat);
            } else if (stat.isDirectory()) {
                walk(filePath, callback);
            }
        });
    });
}

Then you just call (for sync version):

walkSync('path/to/root/dir', function(filePath, stat) {
    // do something with "filePath"...
});

or async version:

walk('path/to/root/dir', function(filePath, stat) {
    // do something with "filePath"...
});

The difference is whether Node blocks while performing the I/O. Given that the API above is the same, you could just use the async version to ensure maximum performance.

However, there is one advantage to using the synchronous version: it is easier to execute some code as soon as the walk is done, as in the next statement after the walk. With the async version, you need some extra way of knowing when you are done, for example collecting all paths first and then enumerating them (a promise-based sketch follows below). For simple build/util scripts (vs. high-performance web servers) you can use the sync version without causing any damage.
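
For example, a sketch of one way to know when the asynchronous walk has finished, assuming Node >= 10 with fs.promises (walkAsync is a hypothetical name, not part of the answer above):

const fsp = require('fs').promises;
const path = require('path');

async function walkAsync(dir, callback) {
    const names = await fsp.readdir(dir);
    for (const name of names) {
        const filePath = path.join(dir, name);
        const stat = await fsp.stat(filePath);
        if (stat.isFile()) {
            callback(filePath, stat);
        } else if (stat.isDirectory()) {
            await walkAsync(filePath, callback);
        }
    }
}

// the returned promise resolves once every entry has been visited
walkAsync('path/to/root/dir', (filePath) => console.log(filePath))
    .then(() => console.log('walk finished'));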

Stool answered 3/2, 2015 at 0:54 Comment(2)
Should replace the line in walkSync from walk(filePath, callback); to walkSync(filePath, callback);Gait
But you're still using fs.statSync, which blocks, in async version. Shouldn't you be using fs.stat instead?Citriculture
T
28

Using Promises with ES7

Asynchronous use with mz/fs

The mz module provides promisified versions of the core node library. Using them is simple. First install the library...

npm install mz

Then...

const fs = require('mz/fs');
fs.readdir('./myDir').then(listing => console.log(listing))
  .catch(err => console.error(err));

Alternatively, you can use them inside asynchronous (ES7) functions:

async function myReaddir () {
  try {
    const listing = await fs.readdir('./myDir/');
    console.log(listing);
  }
  catch (err) { console.error(err); }
}

Update for recursive listing

Some of the users have specified a desire to see a recursive listing (though not in the question)... Use fs-promise. It's a thin wrapper around mz.

npm install fs-promise

then...

const fs = require('fs-promise');
fs.walk('./myDir').then(
    listing => listing.forEach(file => console.log(file.path))
).catch(err => console.error(err));
Toolis answered 30/5, 2016 at 18:48 Comment(1)
fs.walk is removed from fs-promise as it it not supported by fs ( github.com/kevinbeaty/fs-promise/issues/28 )Streit
E
24

non-recursive version

You don't say you want to do it recursively, so I assume you only need the direct children of the directory.

Sample code:

const fs = require('fs');
const path = require('path');

const folder = 'your-directory-path';

const files = fs.readdirSync(folder)
  .filter((file) => fs.lstatSync(path.join(folder, file)).isFile());
Elga answered 1/7, 2016 at 14:25 Comment(0)
T
23

Dependencies.

var fs = require('fs');
var path = require('path');

Definition.

// String -> [String]
function fileList(dir) {
  return fs.readdirSync(dir).reduce(function(list, file) {
    var name = path.join(dir, file);
    var isDir = fs.statSync(name).isDirectory();
    return list.concat(isDir ? fileList(name) : [name]);
  }, []);
}

Usage.

var DIR = '/usr/local/bin';

// 1. List all files in DIR
fileList(DIR);
// => ['/usr/local/bin/babel', '/usr/local/bin/bower', ...]

// 2. List all file names in DIR
fileList(DIR).map((file) => file.split(path.sep).slice(-1)[0]);
// => ['babel', 'bower', ...]

Please note that fileList is way too optimistic. For anything serious, add some error handling.
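
For example, a sketch of the same reducer with basic error handling (safeFileList is a hypothetical name; unreadable entries are skipped instead of throwing):

var fs = require('fs');
var path = require('path');

function safeFileList(dir) {
  var entries = [];
  try {
    entries = fs.readdirSync(dir);
  } catch (err) {
    console.error('Cannot read directory', dir, err.message);
    return [];
  }
  return entries.reduce(function(list, file) {
    var name = path.join(dir, file);
    try {
      var isDir = fs.statSync(name).isDirectory();
      return list.concat(isDir ? safeFileList(name) : [name]);
    } catch (err) {
      console.error('Skipping unreadable entry', name, err.message);
      return list;
    }
  }, []);
}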

Toadstool answered 13/11, 2015 at 3:31 Comment(3)
I added an excludeDirs array argument also. It changes it enough so that maybe you should edit it instead (if you want it). Otherwise I'll add it in a different answer. gist.github.com/AlecTaylor/f3f221b4fb86b4375650Calondra
@AT Nice! You should post your own answer, as it's a useful extension. Let's keep this one featureless.Toadstool
this will cause an error if your input is a directory where you get /Users/user/Desktop/project/example/Users/user/Desktop/project/example/constraints.txtAllhallows
G
20

I'm assuming from your question that you don't want directory names, just files.

Directory Structure Example

animals
├── all.jpg
├── mammals
│   ├── cat.jpg
│   └── dog.jpg
└── insects
    └── bee.jpg

Walk function

Credits go to Justin Maier in this gist

If you want just an array of the file paths (the gist's return_object: false option), use:

const fs = require('fs').promises;
const path = require('path');

async function walk(dir) {
    let files = await fs.readdir(dir);
    files = await Promise.all(files.map(async file => {
        const filePath = path.join(dir, file);
        const stats = await fs.stat(filePath);
        if (stats.isDirectory()) return walk(filePath);
        else if(stats.isFile()) return filePath;
    }));

    return files.reduce((all, folderContents) => all.concat(folderContents), []);
}

Usage

async function main() {
   console.log(await walk('animals'))
}

Output

[
  "/animals/all.jpg",
  "/animals/mammals/cat.jpg",
  "/animals/mammals/dog.jpg",
  "/animals/insects/bee.jpg"
];
Gibbsite answered 13/2, 2020 at 18:17 Comment(5)
@justmaier & a.barbieri - thanks for the code and answer!Bechler
hi if i want to show folder as well so what should i do ? like ` [ "/animals/all.jpg", "/animals/mammals" "/animals/mammals/cat.jpg", "/animals/mammals/dog.jpg", "/animals/insects/bee.jpg" ]; ` any solutionDinerman
Hi @Aakash, try adding files.unshift(dir) berfore the last return of the async function. Anyway it'd be best if you could create a new question as it might help other people with the same need and receive better feedback. ;-)Gibbsite
hi @Gibbsite what if i want to read only starting 2 level folder what i have to do for ex: my directory look like this animals/mammals/name and i want to stop at mammal by providing some depth [ "/animals/all.jpg", "/animals/mammals/cat.jpg", "/animals/mammals/dog.jpg", "/animals/insects/bee.jpg" ];Dinerman
Please create a new question and copy/paste the link it here in the comments. I'll be happy to answer.Gibbsite
P
16

It's just 2 lines of code:

const fs = require('fs')
fs.readdir("./img/", (err, files) => console.log(files))


Primus answered 2/5, 2022 at 11:29 Comment(0)
O
15

If someone is still searching for this, this is what I do:

import fs from 'fs';
import path from 'path';

const getAllFiles = dir =>
    fs.readdirSync(dir).reduce((files, file) => {
        const name = path.join(dir, file);
        const isDirectory = fs.statSync(name).isDirectory();
        return isDirectory ? [...files, ...getAllFiles(name)] : [...files, name];
    }, []);

and it works very well for me

Obnoxious answered 27/1, 2019 at 10:39 Comment(3)
Worked great for me AND it's recursive. Just remember that the import syntax is still behind a flag in Node, you might have to go the old way: const fs = require('fs');Uncommercial
@Obnoxious It works like charm. However, having a bit of difficulty to understand how the [...files, ...getAllFiles(name)] or [...files, name] works. A bit of explanation would be very helpful :)Telfer
@MdMazedulIslamKhan The ... used here is called a spread syntax. What it basically does is takes all objects inside the array and 'spreads' it into the new array. In this case, all entries inside the files array is added to the return along with all the values returned from the recursive call. YOu can refer to the spread syntax here: developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/…Prosenchyma
M
14

Load fs:

const fs = require('fs');

Read files async:

fs.readdir('./dir', function (err, files) {
    // "files" is an Array with files names
});

Read files sync:

var files = fs.readdirSync('./dir');
Mendicity answered 13/9, 2016 at 17:12 Comment(0)
H
13

My one-liner code:

const fs = require("fs")
const path = 'somePath/'

const filesArray = fs.readdirSync(path).filter(file => fs.lstatSync(path+file).isFile())
Harappa answered 11/11, 2021 at 13:21 Comment(2)
Could you provide more details on what the code does and how it helps the OP?Sevier
It simply gets an array of file names from some path. Only names of files, not subdirectory names.Harappa
P
9

Get sorted filenames. You can filter results based on a specific extension such as '.txt', '.jpg' and so on.

import * as fs from 'fs';
import * as Path from 'path';

function getFilenames(path, extension) {
    return fs
        .readdirSync(path)
        .filter(
            item =>
                fs.statSync(Path.join(path, item)).isFile() &&
                (extension === undefined || Path.extname(item) === extension)
        )
        .sort();
}
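
A hypothetical usage example (the './docs' folder and '.txt' extension are placeholders):

// all sorted .txt file names directly inside ./docs
console.log(getFilenames('./docs', '.txt'));

// all sorted file names, regardless of extension
console.log(getFilenames('./docs'));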
Practitioner answered 21/2, 2018 at 14:45 Comment(0)
N
8

My 2 cents, if someone:

Just wants to list file names (excluding directories) from a local sub-folder of their project

  • ✅ No additional dependencies
  • ✅ 1 function
  • ✅ Normalize path (Unix vs. Windows)
const fs = require("fs");
const path = require("path");

/**
 * @param {string} relativeName "resources/foo/goo"
 * @return {string[]}
 */
const listFileNames = (relativeName) => {
  try {
    const folderPath = path.join(process.cwd(), ...relativeName.split("/"));
    return fs
      .readdirSync(folderPath, { withFileTypes: true })
      .filter((dirent) => dirent.isFile())
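      // note: split(".")[0] also drops the file extension, e.g. "usa.yaml" -> "usa"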
      .map((dirent) => dirent.name.split(".")[0]);
  } catch (err) {
    // ...
  }
};

README.md
package.json
resources
 |-- countries
    |-- usa.yaml
    |-- japan.yaml
    |-- gb.yaml
    |-- provinces
       |-- .........


listFileNames("resources/countries") #=> ["usa", "japan", "gb"]
Niobic answered 20/7, 2020 at 6:37 Comment(1)
You have the problem where path is the name of your imported require('path') but then you re-define const path inside the function... This is really confusing and might lead to bugs!Clevelandclevenger
H
8

Try this; it works for me:

import fs from "fs/promises";

const folderPath = "path/to/folder";

export const readDir = async function readDir(path) {

    const files = await fs.readdir(path);

    // array of file names
    console.log(files);
}

readDir(folderPath);
Heraldic answered 9/8, 2022 at 12:7 Comment(2)
It should be the accepted answer. No npm install needed and it works with esm import and async/await. For a full example, you should wrapt it in a function.Necrosis
No it should not, this returns files as well as folders, which is not what the OP asked for.Undersecretary
B
7

This is a TypeScript solution that is asynchronous, optionally recursive, and optionally logs errors. You can specify a regular expression for the file names you want to find.

I used fs-extra, because it's an easy superset improvement on fs.

import * as FsExtra from 'fs-extra'
import * as Path from 'path'

/**
 * Finds files in the folder that match filePattern, optionally passing back errors .
 * If folderDepth isn't specified, only the first level is searched. Otherwise anything up
 * to Infinity is supported.
 *
 * @static
 * @param {string} folder The folder to start in.
 * @param {string} [filePattern='.*'] A regular expression of the files you want to find.
 * @param {(Error[] | undefined)} [errors=undefined]
 * @param {number} [folderDepth=0]
 * @returns {Promise<string[]>}
 * @memberof FileHelper
 */
public static async findFiles(
    folder: string,
    filePattern: string = '.*',
    errors: Error[] | undefined = undefined,
    folderDepth: number = 0
): Promise<string[]> {
    const results: string[] = []

    // Get all files from the folder
    let items = await FsExtra.readdir(folder).catch(error => {
        if (errors) {
            errors.push(error) // Save errors if we wish (e.g. folder perms issues)
        }

        return results
    })

    // Go through to the required depth and no further
    folderDepth = folderDepth - 1

    // Loop through the results, possibly recurse
    for (const item of items) {
        try {
            const fullPath = Path.join(folder, item)

            if (
                FsExtra.statSync(fullPath).isDirectory() &&
                folderDepth > -1
            ) {
                // Its a folder, recursively get the child folders' files
                results.push(
                    ...(await FileHelper.findFiles(fullPath, filePattern, errors, folderDepth))
                )
            } else {
                // Filter by the file name pattern, if there is one
                if (filePattern === '.*' || item.search(new RegExp(filePattern, 'i')) > -1) {
                    results.push(fullPath)
                }
            }
        } catch (error) {
            if (errors) {
                errors.push(error) // Save errors if we wish
            }
        }
    }

    return results
}
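
A hypothetical usage sketch (the folder, pattern, and depth are placeholders; findFiles lives on the FileHelper class mentioned in the JSDoc):

// inside an async function
const errors = []
// find .ts files up to two folder levels below ./src, case-insensitively
const tsFiles = await FileHelper.findFiles('./src', '\\.ts$', errors, 2)
console.log(tsFiles, errors)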
Blockage answered 18/4, 2019 at 14:12 Comment(0)
V
7

Using flatMap:

const fs = require("fs");

function getFiles(dir) {
  return fs.readdirSync(dir).flatMap((item) => {
    const path = `${dir}/${item}`;
    if (fs.statSync(path).isDirectory()) {
      return getFiles(path);
    }

    return path;
  });
}

Given the following directory:

dist
├── 404.html
├── app-AHOLRMYQ.js
├── img
│   ├── demo.gif
│   └── start.png
├── index.html
└── sw.js

Usage:

getFiles("dist")

Output:

[
  'dist/404.html',
  'dist/app-AHOLRMYQ.js',
  'dist/img/demo.gif',
  'dist/img/start.png',
  'dist/index.html'
]
Viperous answered 18/10, 2021 at 1:33 Comment(0)
D
6

Out of the box

In case you want an object with the directory structure out of the box, I highly recommend you check out directory-tree.

Let's say you have this structure:

photos
├── june
│   └── windsurf.jpg
└── january
    ├── ski.png
    └── snowboard.jpg

const dirTree = require("directory-tree");
const tree = dirTree("/path/to/photos");

Will return:

{
  path: "photos",
  name: "photos",
  size: 600,
  type: "directory",
  children: [
    {
      path: "photos/june",
      name: "june",
      size: 400,
      type: "directory",
      children: [
        {
          path: "photos/june/windsurf.jpg",
          name: "windsurf.jpg",
          size: 400,
          type: "file",
          extension: ".jpg"
        }
      ]
    },
    {
      path: "photos/january",
      name: "january",
      size: 200,
      type: "directory",
      children: [
        {
          path: "photos/january/ski.png",
          name: "ski.png",
          size: 100,
          type: "file",
          extension: ".png"
        },
        {
          path: "photos/january/snowboard.jpg",
          name: "snowboard.jpg",
          size: 100,
          type: "file",
          extension: ".jpg"
        }
      ]
    }
  ]
}

Custom Object

Otherwise, if you want to create a directory tree object with your custom settings, have a look at the following snippet. A live example is visible on this codesandbox.

// my-script.js
const fs = require("fs");
const path = require("path");

const isDirectory = filePath => fs.statSync(filePath).isDirectory();
const isFile = filePath => fs.statSync(filePath).isFile();

const getDirectoryDetails = filePath => {
  const dirs = fs.readdirSync(filePath);
  return {
    dirs: dirs.filter(name => isDirectory(path.join(filePath, name))),
    files: dirs.filter(name => isFile(path.join(filePath, name)))
  };
};

const getFilesRecursively = (parentPath, currentFolder) => {
  const currentFolderPath = path.join(parentPath, currentFolder);
  let currentDirectoryDetails = getDirectoryDetails(currentFolderPath);

  const final = {
    current_dir: currentFolder,
    dirs: currentDirectoryDetails.dirs.map(dir =>
      getFilesRecursively(currentFolderPath, dir)
    ),
    files: currentDirectoryDetails.files
  };

  return final;
};

const getAllFiles = relativePath => {
  const fullPath = path.join(__dirname, relativePath);
  const parentDirectoryPath = path.dirname(fullPath);
  const leafDirectory = path.basename(fullPath);

  const allFiles = getFilesRecursively(parentDirectoryPath, leafDirectory);
  return allFiles;
};

module.exports = { getAllFiles };

Then you can simply do:

// another-file.js 

const { getAllFiles } = require("path/to/my-script");

const allFiles = getAllFiles("/path/to/my-directory");
Directory answered 11/2, 2020 at 20:11 Comment(0)
T
5

Here's an asynchronous recursive version.

var fs = require('fs');

// the callback gets (err, files) where files is an array of file names
function listFiles(path, callback) {
    if (typeof callback !== 'function') return;
    var result = [];
    var files = [path.replace(/\/\s*$/, '')];

    function traverseFiles() {
        if (files.length) {
            var name = files.shift();
            fs.stat(name, function (err, stats) {
                if (err) {
                    // in case there are broken symbolic links or a bad path,
                    // skip the file instead of sending the error
                    if (err.errno == 34) traverseFiles();
                    else callback(err);
                } else if (stats.isDirectory()) {
                    fs.readdir(name, function (err, files2) {
                        if (err) callback(err);
                        else {
                            files = files2
                                .map(function (file) { return name + '/' + file; })
                                .concat(files);
                            traverseFiles();
                        }
                    });
                } else {
                    result.push(name);
                    traverseFiles();
                }
            });
        } else {
            callback(null, result);
        }
    }

    traverseFiles();
}
Tiffanytiffi answered 1/3, 2014 at 20:57 Comment(3)
Get into the habit of adding semicolons to the end of your statements. You can't minify code otherwise. Nevertheless, thanks for the much needed async contribution.Dykes
HAHAHAHA that's not part of the spec, just some random person calling their preferred linting style "standardjs". Semicolons are good practice especially in Javascript to maintain code clarity. Otherwise you and your team must memorize the rules of automatic semicolon insertion, and I know at least the average JS developer where I work is not that diligent.Dykes
@Dykes But since ASI exists, we can use it, no? I use eslint and prettier to format my code on save regularly and semicolon insertion is a non-issue.Kensell
C
5

Took the general approach of @Hunan-Rostomyan, made it a little more concise and added an excludeDirs argument. It'd be trivial to extend with includeDirs, just follow the same pattern:

import * as fs from 'fs';
import * as path from 'path';

function fileList(dir, excludeDirs?) {
    return fs.readdirSync(dir).reduce(function (list, file) {
        const name = path.join(dir, file);
        if (fs.statSync(name).isDirectory()) {
            if (excludeDirs && excludeDirs.length) {
                excludeDirs = excludeDirs.map(d => path.normalize(d));
                const idx = name.indexOf(path.sep);
                const directory = name.slice(0, idx === -1 ? name.length : idx);
                if (excludeDirs.indexOf(directory) !== -1)
                    return list;
            }
            return list.concat(fileList(name, excludeDirs));
        }
        return list.concat([name]);
    }, []);
}

Example usage:

console.log(fileList('.', ['node_modules', 'typings', 'bower_components']));
Calondra answered 15/1, 2016 at 0:28 Comment(2)
I have a main folder, scss, and inside it another folder, themes, but the final list gives me all directories, not only the ones outside the excluded directories. What's happening?Nashbar
It only works fine with the '.' directory; with other directories it doesn't work.Nashbar
Z
2

I usually use fs-extra:

const Fse = require('fs-extra');

// fs-extra's readdir returns a promise, so await it (inside an async function)
const fileNameArray = await Fse.readdir('/some/path');

Result:

[
  "b7c8a93c-45b3-4de8-b9b5-a0bf28fb986e.jpg",
  "daeb1c5b-809f-4434-8fd9-410140789933.jpg"
]
Zincate answered 7/7, 2020 at 22:40 Comment(1)
If I need to read subdirectories, I mean recursively, then how is fs-extra useful? @ZincateRevengeful
G
1

Just a heads up: if you're planning to perform operations on each file in a directory, try vinyl-fs (which is used by gulp, the streaming build system).
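
A minimal sketch of what that might look like (assuming vinyl-fs is installed; the glob is a placeholder):

const vfs = require('vinyl-fs');

// streams every .js file under src/ as Vinyl file objects
vfs.src('./src/**/*.js')
  .on('data', (file) => console.log(file.path))
  .on('end', () => console.log('done'));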

Geochemistry answered 12/6, 2014 at 5:54 Comment(0)
A
1

This will work and store the result in a test.txt file in the same directory:

const fs = require('fs');

fs.readdirSync(__dirname).forEach(file => {
    fs.appendFileSync("test.txt", file + "\n");
});
Antiperspirant answered 19/7, 2019 at 4:37 Comment(0)
G
1

I've recently built a tool that does just this... It fetches a directory asynchronously and returns a list of items. You can either get directories, files, or both, with folders first. You can also paginate the data in case you don't want to fetch the entire folder.

https://www.npmjs.com/package/fs-browser

This is the link, hope it helps someone!

Genesis answered 11/4, 2020 at 21:3 Comment(0)
S
1

No npm install. This works for the current folder where you launch the terminal, but you can change process.cwd() to another folder. Enjoy!

const fs=require('fs');
fs.readdirSync(process.cwd());
Springclean answered 26/9, 2023 at 13:47 Comment(0)
A
1

The modern (Node 21) version of the current top-listed answer, using the promise-based fs API, is:

import { readdir } from 'node:fs/promises'

const testFolder = "./tests/"

for (const file of await readdir(testFolder)) {
    console.log(file);
}

Amorete answered 27/3 at 2:42 Comment(0)
G
0

I made a node module to automate this task: mddir

Usage

node mddir "../relative/path/"

To install: npm install mddir -g

To generate markdown for current directory: mddir

To generate for any absolute path: mddir /absolute/path

To generate for a relative path: mddir ~/Documents/whatever.

The md file gets generated in your working directory.

Currently ignores node_modules and .git folders.

Troubleshooting

If you receive the error 'node\r: No such file or directory', the issue is that your operating system uses different line endings and mddir can't parse them without you explicitly setting the line ending style to Unix. This usually affects Windows, but also some versions of Linux. Setting line endings to Unix style has to be performed within the mddir npm global bin folder.

Line endings fix

Get npm bin folder path with:

npm config get prefix

Cd into that folder

brew install dos2unix

dos2unix lib/node_modules/mddir/src/mddir.js

This converts line endings to Unix instead of Dos

Then run as normal with: node mddir "../relative/path/".

Example generated markdown file structure 'directoryList.md'

    |-- .bowerrc
    |-- .jshintrc
    |-- .jshintrc2
    |-- Gruntfile.js
    |-- README.md
    |-- bower.json
    |-- karma.conf.js
    |-- package.json
    |-- app
        |-- app.js
        |-- db.js
        |-- directoryList.md
        |-- index.html
        |-- mddir.js
        |-- routing.js
        |-- server.js
        |-- _api
            |-- api.groups.js
            |-- api.posts.js
            |-- api.users.js
            |-- api.widgets.js
        |-- _components
            |-- directives
                |-- directives.module.js
                |-- vendor
                    |-- directive.draganddrop.js
            |-- helpers
                |-- helpers.module.js
                |-- proprietary
                    |-- factory.actionDispatcher.js
            |-- services
                |-- services.cardTemplates.js
                |-- services.cards.js
                |-- services.groups.js
                |-- services.posts.js
                |-- services.users.js
                |-- services.widgets.js
        |-- _mocks
            |-- mocks.groups.js
            |-- mocks.posts.js
            |-- mocks.users.js
            |-- mocks.widgets.js
Grillo answered 28/10, 2017 at 3:52 Comment(0)
C
0

Use the npm list-contents module. It reads the contents and sub-contents of the given directory and returns the lists of file and folder paths.

const list = require('list-contents');

list("./dist",(o)=>{
  if(o.error) throw o.error;
   console.log('Folders: ', o.dirs);
   console.log('Files: ', o.files);
});
Canaveral answered 23/7, 2018 at 10:49 Comment(0)
B
0

If many of the above options seem too complex or not what you are looking for, here is another approach using node-dir - https://github.com/fshost/node-dir

npm install node-dir

Here is a simple function to list all .xml files, searching in subdirectories:

import * as nDir from 'node-dir';
import * as path from 'path';

function listXMLs(rootFolderPath) {
    let xmlFiles ;

    nDir.files(rootFolderPath, function(err, items) {
        xmlFiles = items.filter(i => {
            return path.extname(i) === '.xml' ;
        }) ;
        console.log(xmlFiles) ;       
    });
}
Badajoz answered 14/2, 2021 at 23:50 Comment(0)
G
0
const fs = require('fs');
const path = require('path');

function readFile(filePath) {
  return new Promise((resolve, reject) => {
    fs.readFile(filePath, 'utf8', (err, data) => {
      if (err) {
        reject(err);
      } else {
        resolve(data);
      }
    });
  });
}

function readFolderFiles(folderPath) {
  return new Promise((resolve, reject) => {
    fs.readdir(folderPath, { withFileTypes: true }, (err, files) => {
      if (err) {
        reject(err);
      } else {
        const filePromises = files
          .filter((file) => file.isFile()) // skip sub-directories
          .map((file) => {
            const filePath = path.join(folderPath, file.name);
            return readFile(filePath);
          });

        Promise.all(filePromises)
          .then((fileContents) => {
            resolve(fileContents);
          })
          .catch((err) => {
            reject(err);
          });
      }
    });
  });
}

// Usage example
const folderPath = './s3';

readFolderFiles(folderPath)
  .then((fileContents) => {
    fileContents.forEach((content, index) => {
      console.log(`File ${index + 1}:`);
      console.log(content);
      console.log('------------------');
    });
  })
  .catch((err) => {
    console.error('Error:', err);
  });
Guttering answered 12/6, 2023 at 12:13 Comment(0)
C
-1
function getFilesRecursiveSync(dir, fileList, optionalFilterFunction) {
    if (!fileList) {
        grunt.log.error("Variable 'fileList' is undefined or NULL.");
        return;
    }
    var files = fs.readdirSync(dir);
    for (var i in files) {
        if (!files.hasOwnProperty(i)) continue;
        var name = dir + '/' + files[i];
        if (fs.statSync(name).isDirectory()) {
            getFilesRecursiveSync(name, fileList, optionalFilterFunction);
        } else {
            if (optionalFilterFunction && optionalFilterFunction(name) !== true)
                continue;
            fileList.push(name);
        }
    }
}
Circosta answered 1/4, 2014 at 15:35 Comment(0)
