When using sudo rm -r, how can I delete all files, with the exception of the following:
textfile.txt
backup.tar.gz
script.php
database.sql
info.txt
find [path] -type f -not -name 'textfile.txt' -not -name 'backup.tar.gz' -delete
If you don't specify -type f, find will also list directories, which you may not want.
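Extended to all five names from the question, that would look roughly like:
find [path] -type f -not -name 'textfile.txt' -not -name 'backup.tar.gz' \
    -not -name 'script.php' -not -name 'database.sql' -not -name 'info.txt' -delete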
Or a more general solution using the very useful combination find | xargs:
find [path] -type f -not -name 'EXPR' -print0 | xargs -0 rm --
For example, to delete all non-txt files in the current directory:
find . -type f -not -name '*txt' -print0 | xargs -0 rm --
The -print0 and -0 combination is needed if there are spaces in any of the filenames that should be deleted.
find . -type f -not -name '*ignore1' -not -name '*ignore2' | xargs rm – Own
Add -not -name ".*" -maxdepth 1 to protect configuration and version control metadata. – Solecism
Consider adding -f (i.e., xargs -0 rm -f --), as this command throws a rm: missing operand error otherwise, which may be problematic if the command is used in a script. – Cyperaceous
Instead of -print0 | xargs -0 rm --, you can just use -delete. – Nonah
rme textfile.txt backup.tar.gz – Esquibel
You can use -delete, and removing -type f is actually really handy, because it deletes all directories which end up empty from deleting the files, but keeps the ones that happen to have your kept files in them. – Nitrous
rm !(textfile.txt|backup.tar.gz|script.php|database.sql|info.txt)
The extglob (Extended Pattern Matching) needs to be enabled in BASH (if it's not enabled):
shopt -s extglob
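Put together for the files from the question (run the shopt line by itself first; see the comments below):
shopt -s extglob    # enable extended pattern matching
shopt extglob       # optional check: should report that extglob is on
rm !(textfile.txt|backup.tar.gz|script.php|database.sql|info.txt)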
This didn't work for me: shopt -s extglob; rm -rf !(README|LICENSE). Any idea why? – Alarice
In the end I used find with the -delete option. – Alarice
extglob is OFF by default, so if you find it's on already, something must have turned it on, such as your bash profile (either directly or by sourcing a script that turns it on, which happens with careless 3rd-party scripts). – Fleabane
Presumably it failed because a shell other than bash was used, e.g. dash, where extglob is not supported. However, in an interactive bash shell, the command will ALSO not work as stated, though for different reasons. The short of it: execute shopt -s extglob BY ITSELF; ONCE SUCCESSFULLY ENABLED (verify with shopt extglob), execute rm -rf !(README|LICENSE). (While extglob is not yet enabled, !(...) is evaluated by history expansion BEFORE the commands are executed; since this likely fails, NEITHER command is executed, and extglob is never turned on.) – Fleabane
Even with shopt -s extglob run in the command-line interface, I had to rerun it inside my script file. This solved the problem for me. – Khan
rm -rf * .* !(src/|lost+found/) <-- it didn't work as I expected; it deleted everything ;) – Ocher
find . | grep -v "excluded files criteria" | xargs rm
This will list all files in the current directory, then list all those that don't match your criteria (beware of it matching directory names), and then remove them.
Update: based on your edit, if you really want to delete everything from the current directory except the files you listed, this can be used:
mkdir /tmp_backup && mv textfile.txt backup.tar.gz script.php database.sql info.txt /tmp_backup/ && rm -r * && mv /tmp_backup/* . && rmdir /tmp_backup
It will create a backup directory /tmp_backup (you've got root privileges, right?), move the files you listed to that directory, recursively delete everything in the current directory (you know that you're in the right directory, right?), move everything from /tmp_backup back to the current directory, and finally delete /tmp_backup.
I chose the backup directory to be in root, because if you're trying to delete everything recursively from root, your system will have big problems.
Surely there are more elegant ways to do this, but this one is pretty straightforward.
find . | egrep -v "\.tex|\.bib" | xargs rm – Illuminant
find . -maxdepth 1 | grep -v "exclude these" | xargs rm -r works much faster, as it doesn't need to drill down into directories unnecessarily. – Cano
A word of caution regarding the find pipeline: efficiency issues aside (3 commands are used for what find could do alone), this will not work as intended with filenames with embedded spaces and will potentially delete the wrong files. – Fleabane
Moving everything to /tmp_backup doesn't end well if it's interrupted: from the user's perspective, all the files have vanished, unless they know where to go looking for them to get them back. For that reason I'm not in favour of this type of strategy. – Gleeson
I prefer to use a sub-query list:
rm -r `ls | grep -v "textfile.txt\|backup.tar.gz\|script.php\|database.sql\|info.txt"`
-v, --invert-match select non-matching lines
\| is the separator between the patterns.
To avoid preserving files with similar names:
rm -r `ls | grep -v "^textfile.txt$\|^backup.tar.gz$"`
Without anchoring the patterns, this would also preserve files like old-textfile.txt, etc. – Downfall
If grep -v filters out everything, this gives rm: missing operand. – Indopacific
Assuming that files with those names exist in multiple places in the directory tree and you want to preserve all of them:
find . -type f ! -regex ".*/\(textfile.txt\|backup.tar.gz\|script.php\|database.sql\|info.txt\)" -delete
You can use the GLOBIGNORE environment variable in Bash.
Suppose you want to delete all files except the php and sql ones; then you can do the following:
export GLOBIGNORE=*.php:*.sql
rm *
export GLOBIGNORE=
Setting GLOBIGNORE like this excludes php and sql files from wildcard expansions such as "ls *" or "rm *". So, using "rm *" after setting the variable will delete only the txt and tar.gz files.
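Applied to the file names in the question it would look something like this; note that a non-empty GLOBIGNORE also turns on dotglob, so * will then match hidden files as well:
GLOBIGNORE=textfile.txt:backup.tar.gz:script.php:database.sql:info.txt
rm *              # removes everything else in the current directory
unset GLOBIGNORE  # back to normal globbing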
An alternative to exporting and then resetting the GLOBIGNORE variable is to use a subshell: (GLOBIGNORE='*.php:*.sql'; rm *) – Fleabane
Since nobody mentioned it: copy the files you want to keep somewhere safe, delete everything, then move the copies back into place.
You could use cp or rsync to preserve permissions. Anyway, this is just an alternate method (given as a suggestion) that has its place here, as an answer to the OP. – Lookeron
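A minimal sketch of that round trip, assuming rsync is available; the scratch directory comes from mktemp and is just an example, and cp -a would work similarly:
keep_dir=$(mktemp -d)   # scratch location; the name is arbitrary
rsync -a textfile.txt backup.tar.gz script.php database.sql info.txt "$keep_dir"/
rm -r ./*               # removes everything (non-hidden) in the current directory
rsync -a "$keep_dir"/ . # copies the kept files back, permissions intact
rm -r "$keep_dir"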
You can write a for loop for this... %)
for x in *
do
    if [ "$x" != "exclude_criteria" ]
    then
        rm -f "$x"
    fi
done
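To cover all five names from the question at once, one option is a case statement instead of the single test; just a sketch:
for x in *
do
    case "$x" in
        textfile.txt|backup.tar.gz|script.php|database.sql|info.txt)
            # keep these
            ;;
        *)
            rm -f -- "$x"
            ;;
    esac
done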
A little late for the OP, but hopefully useful for anyone who gets here much later via Google...
I found the answer by @awi and the comment on -delete by @Jamie Bullock really useful. Here is a simple utility so you can do this in different directories, ignoring different file names/types each time, with minimal typing:
rm_except (or whatever you want to name it)
#!/bin/bash
# Build a list of -not -name tests, one per argument
ignore=""
for fignore in "$@"; do
    ignore=${ignore}"-not -name ${fignore} "
done
# $ignore is deliberately left unquoted so it splits into separate find arguments
find . -type f $ignore -delete
e.g. to delete everything except for text files and foo.bar:
rm_except '*.txt' foo.bar
Similar to @mishunika, but without the if clause.
If you're using zsh (which I highly recommend):
rm -rf ^<file/folder pattern to avoid>
With extended_glob:
setopt extended_glob
rm -- ^*.txt
rm -- ^*.(sql|txt)
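For the specific files from the question, this could look roughly like the following in zsh (note that the negated pattern matches directories as well as plain files):
setopt extended_glob
rm -- ^(textfile.txt|backup.tar.gz|script.php|database.sql|info.txt)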
This worked for me:
rm -r !(Applications|"Virtualbox VMs"|Downloads|Documents|Desktop|Public)
but names with spaces are (as always) tough. I also tried Virtualbox\ VMs instead of the quotes, but it always deletes that directory (Virtualbox VMs) anyway.
rm !(myfile.txt) removes everything including myfile.txt – Tolley
You need to run shopt -s extglob first, as @Predation pointed out – Minyan
Run shopt -s extglob, then cd /Users/alumno/, and finally rm -rf !(Applications|Virtualbox*VMs|Downloads|Documents|Desktop|Public|Library). Read about extended globbing here – Vagabondage
Just:
rm $(ls -I "*.txt")  # deletes all files except *.txt
Or:
rm $(ls -I "*.txt" -I "*.pdf")  # deletes all files except *.txt and *.pdf
Parsing ls output can become a problem. See here for detailed information. – Greenroom
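If parsing ls output is a concern, roughly the same selection can be done with find instead (a sketch restricted to the current directory, along the lines of the earlier find answers):
find . -maxdepth 1 -type f ! -name '*.txt' ! -name '*.pdf' -delete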
Make the files immutable. Not even root will be allowed to delete them.
chattr +i textfile.txt backup.tar.gz script.php database.sql info.txt
rm *
All other files have been deleted.
Afterwards you can make them mutable again:
chattr -i *
I believe you can use
rm -v !(filename)
Except for that filename, all the other files in the directory will be deleted; make sure you are using it in bash with extglob enabled.
This is similar to the comment from @siwei-shen, but uses the -o flag to combine multiple patterns; the -o flag stands for 'or'. Note that the name tests have to be grouped so the negation applies to both:
find . -type f -not \( -name '*ignore1' -o -name '*ignore2' \) | xargs rm
You can do this with two command sequences. First define an array with the names of the files you do not want to delete:
files=( backup.tar.gz script.php database.sql info.txt )
After that, loop through all files in the directory, checking whether each filename is in the array of files to keep; if it's not, delete the file.
for file in *; do
    if [[ ! " ${files[@]} " =~ " ${file} " ]]; then
        rm "$file"
    fi
done
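If filenames might contain spaces, an explicit inner loop avoids the substring comparison; a minimal sketch of that variant (same example array):
files=( backup.tar.gz script.php database.sql info.txt )
for file in *; do
    keep=no
    for f in "${files[@]}"; do
        if [ "$file" = "$f" ]; then
            keep=yes
            break
        fi
    done
    if [ "$keep" = no ]; then
        rm -- "$file"
    fi
done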
The answer I was looking for was to run a script, but I wanted to avoid deleting the script itself. So in case someone is looking for a similar answer, do the following.
Create a .sh file and write the following code:
cp my_run_build.sh ../../
rm -rf *
cp ../../my_run_build.sh .
# amend rest of the script
Since no one has mentioned it yet, here is another option for one particular case:
OLD_FILES=`echo *`
... create new files ...
rm -r $OLD_FILES
(or just rm $OLD_FILES)
or
OLD_FILES=`ls *`
... create new files ...
rm -r $OLD_FILES
You may need to use shopt -s nullglob if some of the files may or may not be there:
SET_OLD_NULLGLOB=`shopt -p nullglob`
shopt -s nullglob
FILES=`echo *.sh *.bash`
$SET_OLD_NULLGLOB
Without nullglob, echo *.sh *.bash may give you "a.sh b.sh *.bash".
(Having said all that, I myself prefer this answer, even though it does not work in OSX)
Rather than going for a direct command, move the required files to a temp dir outside the current dir, then delete all files using rm * or rm -r *, and finally move the required files back into the current dir.
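For example, roughly like this (the temp directory name is arbitrary):
mkdir ../keep_tmp
mv textfile.txt backup.tar.gz script.php database.sql info.txt ../keep_tmp/
rm -r *
mv ../keep_tmp/* .
rmdir ../keep_tmp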
Remove everything except file.name:
ls -d /path/to/your/files/* | grep -v file.name | xargs rm -rf
To clarify: do you want to (a) keep only the listed files directly in the target directory (rm -r implies: delete everything else, including subdirectories, even if they contain files with the specified names); OR (b) traverse the entire subtree of the target directory and, in each directory, delete all files except those with the names listed? – Fleabane
This deleted my .git, and not having pushed, I was unable to recover over 30 commits. Make sure you exclude everything you care about, hidden folders included. And set -maxdepth 1 if you're dealing with directories. – Ishmael