List files recursively in Linux CLI with path relative to the current directory
Asked Answered
L

14

254

This is similar to this question, but I want to include the path relative to the current directory in Unix. If I do the following:

ls -LR | grep .txt

It doesn't include the full paths. For example, I have the following directory structure:

test1/file.txt
test2/file1.txt
test2/file2.txt

The code above will return:

file.txt
file1.txt
file2.txt

How can I get it to include the paths relative to the current directory using standard Unix commands?

Lefevre answered 29/10, 2008 at 3:28 Comment(2)
This just shows that ls is missing this feature.Folio
It's a shame all of these solutions require find or tree. I'm ssh'ing into an Android device where I appear to only have ls, and none of these other tools :/Ketchum
S
347

Use find:

find . -name \*.txt -print

On systems that use GNU find, like most GNU/Linux distributions, you can leave out the -print.
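
For the directory layout in the question, this prints each match with its path relative to the current directory (prefixed with ./); the order may vary:

$ find . -name \*.txt -print
./test1/file.txt
./test2/file1.txt
./test2/file2.txt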

Samantha answered 29/10, 2008 at 3:34 Comment(6)
...and for that matter you can leave out the '.'Descombes
For absolute paths, use find $(pwd) -name \*.txtSewer
If the pattern contains part of a directory name, use -path "*docs/*.txt" instead of -name. -wholename is the same as -path.Filbert
Use -type f to only return files and not directories, symbolic links, etc.Josephinajosephine
-name "*.txt" will NOT match file.TXT — always use -iname "*.txt" instead which is case-insensitive.Glasser
Even simpler: find -name "*.txt". Works on Ubuntu systems at least, which I guess means it works on Debian systems too.Hutchison
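
Putting these comments together, a hedged sketch that matches .txt files case-insensitively, skips directories, and limits matches to a docs/ subtree (the docs/ part is purely illustrative):

$ find . -type f -path '*docs/*' -iname '*.txt'
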
P
77

Use tree, with -f (full path) and -i (no indentation lines):

tree -if --noreport .
tree -if --noreport directory/

You can then use grep to filter out the ones you want.
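
For the layout in the question, filtering for .txt files might look like this:

$ tree -if --noreport . | grep '\.txt$'
./test1/file.txt
./test2/file1.txt
./test2/file2.txt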


If the command is not found, you can install it:

Type the following command to install tree on RHEL/CentOS and Fedora Linux:

# yum install tree -y

If you are using Debian/Ubuntu or Mint Linux, type the following command in your terminal:

$ sudo apt-get install tree -y
Patin answered 28/4, 2010 at 0:45 Comment(1)
any setting to list only the files, and exclude folders?Proficiency
T
27

Try find. You can look it up exactly in the man page, but it's sorta like this:

find [start directory] -name [what to find]

so for your example

find . -name "*.txt"

should give you what you want.
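
The start directory can be any path, and results are printed relative to it. For instance, limiting the search to the test2 directory from the question:

$ find test2 -name "*.txt"
test2/file1.txt
test2/file2.txt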

Tetrode answered 29/10, 2008 at 3:33 Comment(0)
P
12

You could use find instead:

find . -name '*.txt'
Photophilous answered 29/10, 2008 at 3:36 Comment(0)
S
7

To get the full paths of the desired files using the find command, combine it with the pwd command:

find $(pwd) -name \*.txt -print
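
Assuming the current directory is /home/user/project (a made-up path for illustration), the output becomes absolute:

$ find $(pwd) -name \*.txt -print
/home/user/project/test1/file.txt
/home/user/project/test2/file1.txt
/home/user/project/test2/file2.txt
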
Spencer answered 30/12, 2011 at 5:49 Comment(0)
A
6

That does the trick:

ls -R1 $PWD | while read l; do case $l in *:) d=${l%:};; "") d=;; *) echo "$d/$l";; esac; done | grep -i ".txt"

It does so by "sinning", though: parsing the output of ls, which is considered bad form by the GNU and Ghostscript communities.
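
The same pipeline spread over several lines for readability (a sketch: quoting added, read -r used, and the dot in the grep pattern escaped; behaviour otherwise unchanged):

ls -R1 "$PWD" | while read -r l; do
  case $l in
    *:) d=${l%:} ;;        # "some/dir:" header: remember the directory
    "") d= ;;              # blank separator line: reset
    *)  echo "$d/$l" ;;    # ordinary entry: print it with its directory
  esac
done | grep -i '\.txt'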

Ankylostomiasis answered 14/3, 2016 at 21:41 Comment(1)
@CAFEBABE: it is considered taboo to parse the results of ls because it generally leads to bugs. The sentence in question was trying to call that out. It also had a joke in it that I have removed.Paroicous
C
4
DIR=your_path
find "$DIR" | sed "s:^$DIR/*::"

sed strips the your_path/ prefix from every line of the find output, so you receive paths relative to DIR. (Note the double quotes around the sed expression, so that $DIR actually expands.)
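
A quick sketch of what this produces for the question's tree, using DIR=test2:

$ DIR=test2
$ find "$DIR" | sed "s:^$DIR/*::"

file1.txt
file2.txt

The blank line is the test2 directory itself; add -type f to the find command if you only want files.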

Casilde answered 15/10, 2009 at 11:4 Comment(2)
this is exactly what i need but it still spits out the full path for me?Predestinarian
don't use the apostrophe's - find /tmp | sed s:"/tmp":: | moreDeservedly
S
1

Here is a Perl script:

use strict;
use warnings;

# Walk the captured "ls -LR" output: remember the most recent
# "directory:" header and prefix every following file name with it.
sub format_lines {
    my ($refonlines) = @_;
    my @lines   = @{$refonlines};
    my $tmppath = "-";

    foreach my $line (@lines) {
        next if $line =~ /^\s+/;            # skip blank lines
        if ($line =~ /(^\w+(\/\w*)*):/) {   # a "directory:" header line
            $tmppath = $1 if defined $1;
            next;
        }
        print "$tmppath/$line";             # $line still ends in "\n"
    }
}

sub main {
    my @lines = ();

    while (<>) {
        push @lines, $_;
    }
    format_lines(\@lines);
}

main();

usage:

ls -LR | perl format_ls-LR.pl
Summit answered 27/11, 2009 at 9:45 Comment(1)
This is a terrible Perl program. It's probably 2-3 times as long, and complicated, than it needs to be.Investigation
B
1

You could create a shell function, e.g. in your .zshrc or .bashrc:

filepath() {
    echo "$PWD/$1"
}

filepath2() {
    for i in "$@"; do
        echo "$PWD/$i"
    done
}

The first one would work on single files only, obviously.
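
A quick usage sketch, assuming the current directory is /home/user/project (a made-up path) and the files from the question:

$ filepath test1/file.txt
/home/user/project/test1/file.txt
$ filepath2 test1/file.txt test2/*.txt
/home/user/project/test1/file.txt
/home/user/project/test2/file1.txt
/home/user/project/test2/file2.txt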

Basie answered 30/3, 2011 at 18:50 Comment(0)
B
1

This finds a file called "filename" anywhere on your filesystem, starting the search from the root directory "/":

find / -name "filename" 
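
When run as a regular user, a search from / usually produces a lot of "Permission denied" noise; redirecting stderr keeps the output readable:

find / -name "filename" 2>/dev/null
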
Bergama answered 23/2, 2013 at 2:11 Comment(0)
O
1

If you want to preserve the details that come with ls, such as file size, in your output, then this should work.

sed "s|<OLDPATH>|<NEWPATH>|g" input_file > output_file
Oversweet answered 21/8, 2013 at 14:41 Comment(0)
B
1

In the fish shell, you can do this to list all pdfs recursively, including the ones in the current directory:

$ ls **pdf

Just remove 'pdf' if you want files of any type.
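
For the question's tree, the same idea with text files would be something like the following (fish expands the ** glob before ls runs, so ls simply lists the matched paths; the output layout depends on your ls settings):

$ ls **txt
test1/file.txt  test2/file1.txt  test2/file2.txt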

Bruch answered 23/4, 2019 at 17:31 Comment(0)
G
0

You can implement this by pointing find directly at the target directory and filtering on the file name. From your case, it sounds like the filename always starts with the word file and ends in .txt:

find /some/path/here -name 'file*.txt'   (* represents a wildcard)

Note that piping ls into find would not work, because find ignores its standard input.
Gayton answered 13/4, 2014 at 5:49 Comment(0)
E
0

In my case, with the tree command:

Relative path

tree -ifF ./dir | grep -v '^./dir$' | grep -v '.*/$' | grep '\./.*' | while read -r file; do
  echo "$file"
done

Absolute path

tree -ifF ./dir | grep -v '^./dir$' | grep -v '.*/$' | grep '\./.*' | while read -r file; do
  echo "$file" | sed -e "s|^\.|$PWD|"
done
Excavation answered 5/3, 2020 at 9:33 Comment(0)