Is there the equivalent for a `find` command in `hadoop`?
I know that from the terminal, one can use the find command to locate files, such as:

find . -type d -name "*something*" -maxdepth 4 

But when I am in the Hadoop file system, I have not found a way to do this.

hadoop fs -find ....

throws an error.

How do people traverse files in hadoop? I'm using hadoop 2.6.0-cdh5.4.1.

Morningglory answered 1/10, 2015 at 20:34 Comment(2)
It "throws an error"? What error? find is what I expect most people use. – Sapphera
For future help-seekers: on hadoop 2.6.0-cdh5.4.1, it seems that hadoop fs -ls -R <pattern> doesn't work, but a reasonable alternative is: hadoop fs -ls -R <filepath> | egrep <regex_pattern> – Morningglory
hadoop fs -find was introduced in Apache Hadoop 2.7.0. Most likely you're using an older version, hence you don't have it yet. See HADOOP-8989 for more information.

In the meantime, you can use

hdfs dfs -ls -R <pattern>

e.g.: hdfs dfs -ls -R /demo/order*.*

but that's not as powerful as find, of course, and lacks some basic features. From what I understand, people have been writing scripts around it to get over this limitation.
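One common shape for such a script (a sketch; the /demo path and the "order" pattern are placeholders) is to filter the path column of the recursive listing. The filtering step is factored into a function here so it can be demonstrated on canned listing output:

```shell
#!/bin/sh
# Sketch: emulate `find <dir> -name "*pattern*"` on top of `hdfs dfs -ls -R`.
# The listing's last whitespace-separated field is the full path; we keep
# only that column and grep for the name pattern.
filter_paths() {
  awk '{print $NF}' | grep "$1"
}

# Real invocation would be:
#   hdfs dfs -ls -R /demo | filter_paths order
# Demonstrated below on canned sample lines in the `-ls -R` output format:
printf '%s\n' \
  'drwxr-xr-x   - u g 0 2015-10-01 12:00 /demo/orders' \
  '-rw-r--r--   3 u g 9 2015-10-01 12:01 /demo/orders/order1.csv' \
  '-rw-r--r--   3 u g 9 2015-10-01 12:02 /demo/other/file.txt' \
  | filter_paths order
```

This prints only the two paths containing "order". Note it matches anywhere in the full path, not just the basename, which is one of the ways it falls short of a real find.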

Phosphine answered 1/10, 2015 at 20:52 Comment(1)
Thanks. Any idea how to use the hadoop fs -find "expression" option? The docs say: "The following operators are recognised: expression -a expression; expression -and expression; expression expression" but I have no idea what this means. – Blackhead
If you are using the Cloudera stack, try the find tool:

org.apache.solr.hadoop.HdfsFindTool

Set the command to a bash variable:

COMMAND='hadoop jar /opt/cloudera/parcels/CDH/lib/solr/contrib/mr/search-mr-job.jar org.apache.solr.hadoop.HdfsFindTool'

Usage as follows:

${COMMAND} -find . -name "something" -type d ...
Axel answered 4/4, 2017 at 9:51 Comment(0)
If you don't have the Cloudera parcels available, you can use awk:

hdfs dfs -ls -R /some_path | awk -F / '/^d/ && (NF <= 5) && /something/' 

That's almost equivalent to the find . -type d -name "*something*" -maxdepth 4 command.
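To unpack what each clause does, here is the same awk program run on a canned sample of -ls -R output (the sample paths are made up): /^d/ keeps only directory entries, -F / plus (NF <= 5) caps the path at four components deep, and /something/ matches the name anywhere in the line.

```shell
#!/bin/sh
# Canned sample of `hdfs dfs -ls -R` output; real input would come from HDFS.
printf '%s\n' \
  'drwxr-xr-x   - u g 0 2018-10-30 13:00 /a/b/something_dir' \
  'drwxr-xr-x   - u g 0 2018-10-30 13:00 /a/b/c/d/something_deep' \
  '-rw-r--r--   3 u g 9 2018-10-30 13:01 /a/b/something_file' \
  | awk -F / '/^d/ && (NF <= 5) && /something/'
# Only the /a/b/something_dir line survives: the deep directory fails the
# depth check (splitting on "/" yields more than 5 fields) and the plain
# file fails the leading-"d" directory check.
```

One caveat: /something/ tests the whole listing line, so a match in the date or owner columns would also slip through; the basename-only matching of a real find is not reproduced exactly.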

Protero answered 30/10, 2018 at 13:22 Comment(0)
Adding HdfsFindTool as an alias in .bash_profile makes it easy to use at any time.

Add the following to your profile:

alias hdfsfind='hadoop jar /opt/cloudera/parcels/CDH/lib/solr/contrib/mr/search-mr-job.jar org.apache.solr.hadoop.HdfsFindTool'
alias hdfs='hadoop fs'

You can then use it as follows (here the find tool gets the folder name, file name, and record count for each CSV file under an HDFS source path):

$> cnt=1
$> for ff in `hdfsfind -find /dev/abc/*/2018/02/16/*.csv -type f`; do
     pp=`echo ${ff} | awk -F"/" '{print $7}'`
     fn=`basename ${ff}`
     fcnt=`hdfs -cat ${ff} | wc -l`
     echo "${cnt}=${pp}=${fn}=${fcnt}"
     cnt=`expr ${cnt} + 1`
   done

Simple queries for folder or file details:

$> hdfsfind -find /dev/abc/ -type f -name "*.csv"
$> hdfsfind -find /dev/abc/ -type d -name "toys"

Ringer answered 19/2, 2018 at 1:54 Comment(0)