Run git pull over all subdirectories [duplicate]

How can I update multiple git repositories from their shared parent's directory without cd'ing into each repo's root directory? I have the following which are all separate git repositories (not submodules):

/plugins/cms
/plugins/admin
/plugins/chart

I want to update them all at once or at least simplify my current workflow:

cd ~/plugins/admin
git pull origin master
cd ../chart
git pull

etc.

Objectify answered 16/8, 2010 at 20:39 Comment(6)
What's wrong with find -name .git -execdir git pull \;? – Directive
what about git do pull – Gride
The same question answered for hg mercurial. – Cuprum
find . -name .git -print -execdir git pull \; is OK. -print will echo the current dir. – Maximinamaximize
See also, with Git 2.30 (Q4 2020), the new git for-each-repo command (https://mcmap.net/q/94135/-is-there-any-way-to-list-all-git-repositories-in-terminal) – Katelyn
github.com/earwig/git-repo-updater: “gitup is a tool for updating multiple git repositories at once. It is smart enough to handle several remotes, dirty working directories, diverged local branches, detached HEADs, and more. It was originally created to manage a large collection of projects and deal with sporadic internet access. gitup should work on macOS, Linux, and Windows. You should have the latest version of git and either Python 2.7 or Python 3 installed.” – Anima
438

Run the following from the parent directory, plugins in this case:

find . -type d -depth 1 -exec git --git-dir={}/.git --work-tree=$PWD/{} pull origin master \;

To clarify:

  • find . searches the current directory
  • -type d to find directories, not files
  • -depth 1 for a maximum depth of one sub-directory
  • -exec {} \; runs a custom command for every find
  • git --git-dir={}/.git --work-tree=$PWD/{} pull runs git pull in each of those directories

To play around with find, I recommend using echo after -exec to preview, e.g.:

find . -type d -depth 1 -exec echo git --git-dir={}/.git --work-tree=$PWD/{} status \;

Note: if the -depth 1 option is not available, try -mindepth 1 -maxdepth 1.
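
If your find does not accept -depth 1 at that position (GNU find warns about it), a sketch of the same idea using -mindepth/-maxdepth together with git's -C option (Git 1.8.5 or newer), as several comments below suggest:

# pull the currently checked-out branch of every immediate subdirectory
find . -mindepth 1 -maxdepth 1 -type d -exec git -C {} pull \;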

Wretched answered 19/9, 2012 at 13:0 Comment(14)
Thanks @batandwa for your remark about the use of -d. – Wretched
find: warning: you have specified the -depth option after a non-option argument -type, but options are not positional (-depth affects tests specified before it as well as those specified after it). Please specify options before other arguments. – Conney
I used find . -maxdepth 1 -type d -print -execdir git --git-dir={}/.git --work-tree=$PWD/{} pull origin master \; to output the name of the folder before doing the pull, to get rid of the warning and to only run the pull on subfolders. – Gadmon
replacing 'pull origin master' with fetch origin master:master tells git to explicitly update your 'master' branch with origin's master branch. This will not do a merge; any commits to master will be lost if you do this. – Zanthoxylum
since git 1.8.5 it is possible to replace --git-dir and --work-tree by the -C option, see this question. -- I'm using find . -mindepth 1 -maxdepth 1 -type d -print -exec git -C {} pull \; – Stubborn
Didn't work unfortunately: "find: warning: you have specified the -depth option after a non-option argument -type, but options are not positional (-depth affects tests specified before it as well as those specified after it). Please specify options before other arguments. find: paths must precede expression: 1 Usage: find [-H] [-L] [-P] [-Olevel] [-D help|tree|search|stat|rates|opt|exec] [path...] [expression] " – Cephalic
@ZsoltSzilagy as mentioned by @Gadmon you can use -maxdepth 1 instead of -depth 1 – Anthony
Note that when using this solution for bare repositories you have to omit --work-tree and leave only --git-dir, i.e. find . -type d -depth 1 -exec echo git --git-dir={}/ remote update \; – Yoicks
I use this to avoid hidden folders: find . -not -path '*/\.*' -mindepth 1 -maxdepth 1 -type d -print -exec git -C {} pull \; – Certitude
Can we make each git pull run in parallel? – Trioecious
Also, all the spaces in this line are significant, I found out. – Flatfoot
After following some of the comments that explain that -depth has changed to -maxdepth, I get an error 'missing argument to -exec'. – Passionate
Thank you for this answer, this is what I came up with: find . -type d -name ".git" -a -exec echo {} \; -exec git --git-dir={} --work-tree=$PWD/{}/../ status \; -exec git --git-dir={} --work-tree=$PWD/{}/../ pull \; -exec git --git-dir={} --work-tree=$PWD/{}/../ fetch -ap \; – Labourer
depending on how you use it, you might add --rebase: find . -type d -depth 1 -exec git --git-dir={}/.git --work-tree=$PWD/{} pull --rebase origin master \; – Arlenaarlene
309
ls | xargs -I{} git -C {} pull

To do it in parallel:

ls | xargs -P10 -I{} git -C {} pull
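
The same pattern works for any git subcommand; for example, a sketch (assuming GNU xargs, which provides -P) that fetches and prunes every subdirectory in parallel:

# fetch from all remotes and prune deleted branches, ten repos at a time
ls | xargs -P10 -I{} git -C {} fetch --all --prune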
Landan answered 16/3, 2015 at 8:51 Comment(12)
Nice! I've put it as an alias in my .gitconfig: all = "!f() { ls | xargs -I{} git -C {} $1; }; f" Now I can do git all pull, git all "checkout master" etc. – Tracheo
Cleaned up a bit; this will search all directories recursively for only git repos, and will strip out colors in case you have ls aliased: ls -R --directory --color=never */.git | sed 's/\/.git//' | xargs -P10 -I{} git -C {} pull – Chaliapin
Here's the git config command to add it as an alias: git config --global alias.all '!f() { ls | xargs -I{} git -C {} $1; }; f' – Unreal
I smashed some of the answers together to create this for git on macOS that filters on folders that contain a .git folder, and lets you run arbitrary commands like git all fetch --prune: git config --global alias.all '!f() { ls -R -d */.git | sed 's,\/.git,,' | xargs -P10 -I{} git -C {} $1; }; f' – Taft
@CourtneyFaulkner This looks like it assumes a .git directory is available where the command is run. Am I right? Seems different from the original question where just the subdirectories contain .git dirs. – Valer
@Valer actually, ls -R -d */.git is returning a filtered list of the directories within the current folder that contain a .git directory. That way, when I run something like git all fetch, it only executes against subfolders that have .git folders. It's an answer to the original question, but it tries to be a bit more efficient by not assuming all the subdirectories are git repos. – Taft
A slight improvement over borisdiakur's command, to avoid running on . and to print which directory it's running in at each instant: git config --global alias.all '!f() { ls -R -d */.git | xargs -I{} bash -c "echo {} && git -C {}/../ $1"; }; f' – Directly
@Tracheo Is there a trick to run the pull in parallel in all directories? – Vivien
I'm getting xargs: unknown option -- C – Selfpreservation
If you want to avoid non-git directories: ls -R -d */.git | cut -d'.' -f1 | xargs -I{} git -C {} pull – Megavolt
Why isn't this answer on top? – Nailbiting
this answer is the best: ls | xargs -P10 -I{} git -C {} pull – Belgrade
128

A bit more low-tech than leo's solution:

for i in */.git; do ( echo $i; cd $i/..; git pull; ); done

This will update all Git repositories in your working directory. No need to explicitly list their names ("cms", "admin", "chart"). The "cd" command only affects a subshell (spawned by the parentheses).
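
If the repositories are independent, a variation along the lines of one of the comments below runs the pulls concurrently by backgrounding each subshell and waiting for all of them to finish:

# pull every repository in the background, then wait for all pulls to complete
for i in */.git; do ( echo "$i"; cd "$i/.." && git pull ) & done
wait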

Longspur answered 5/2, 2015 at 15:40 Comment(11)
Exactly what I was looking for. – Dutybound
This one has the advantage of displaying which repository it is dealing with, useful when something goes wrong (missing branch, unavailable remote...). – Mariann
This worked great. I added a crontab -e file to run this every 5 minutes and it appears to do exactly what I was hoping for. – Ordonez
You inspired me. Many thanks. – Maximinamaximize
I like this solution because it only pulls on sub-directories that are a git repo, thx! – Faust
Alternate using git -C: for i in */.git; do git -C $i pull; done – Cattier
Is there any way to automate typing in the username and password for each subfolder repo? – Snapback
@colinodowd: Yes :-) This is independent of this one-liner; you'll have to tell git about the credentials in each individual case. How this is done depends on whether you're accessing your repos using https or ssh. – Longspur
@IngoBlechschmidt Thanks Ingo, I figured it out through another SO thread :-) I love how automated my process is now! – Snapback
This is the most straightforward solution IMO. Just add a '&' to the end of git pull to make it async so you're not waiting for each pull to complete before looping to the next item: for i in */.git; do ( echo $i; cd $i/..; git pull & ); done – Chuch
Simple, easy, working and easily configurable. THX! – Concomitance
58

Actually, if you don't know whether the subfolders contain a git repo or not, the best approach is to let find locate the repos for you:

find . -type d -name .git -exec git --git-dir={} --work-tree=$PWD/{}/.. pull origin master \;

The PowerShell equivalent would be:

Get-ChildItem -Recurse -Directory -Hidden -Filter .git | ForEach-Object { & git --git-dir="$($_.FullName)" --work-tree="$(Split-Path $_.FullName -Parent)" pull origin master }
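
If you would rather not hard-code origin master (see the warning in the comments about accidentally merging master into whatever branch is checked out), a sketch that lets each repository pull its current branch; -execdir runs the command from the directory containing each matched .git, i.e. the repo root:

# pull whichever branch each repository currently has checked out
find . -type d -name .git -execdir git pull \;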
Garibald answered 15/11, 2016 at 3:24 Comment(6)
This works with my macOS. – Charmion
Ensure all your git repos are on master before executing this as written. Otherwise you may be unintentionally merging master into your current branch. – Parsnip
Thank you for the powershell version, still works with PS 7.10 – Scorpius
For PS 2.0 this works: gci | where {$_.Attributes -match'Directory'} | foreach { write-host $_.fullname; push-location $_; & git pull; & cd ..} – Maccabees
Note - I could not find a hidden parameter, but Force seemed to work, as in Get-ChildItem -Path "C:\Repos" -Recurse -Directory -Force -Filter ".git" -Depth 1 – Coraleecoralie
Is there a way to have it act on either master or main based on whichever is used? – Juvenilia
33

I use this one:

find . -name ".git" -type d | sed 's/\/.git//' |  xargs -P10 -I{} git -C {} pull

Universal: updates all git repositories below the current directory.
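
A variant that survives spaces in directory names (assuming GNU find and xargs for -print0/-0) and uses git -C instead of stripping the path with sed, as the comment below recommends:

# NUL-separate the paths so whitespace in directory names is handled safely
find . -name .git -type d -print0 | xargs -0 -P10 -I{} git -C {}/.. pull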

Mccubbin answered 29/1, 2016 at 9:29 Comment(1)
It's dangerous to use --git-dir without --work-tree, which is why the -C shortcut was created. I just messed up my home directory because of this. I would recommend just doing find . -maxdepth 8 -type d -name .git | xargs -P8 -I{} git -C {}/../ fetch --all – Reeher
16

None of the top 5 answers worked for me, and the question talked about directories.

This worked:

for d in *; do pushd $d && git pull && popd; done
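
A slightly more defensive sketch that skips subdirectories without a .git folder and uses a subshell instead of pushd/popd (as the comments below suggest), so a failed pull cannot leave you stranded in a subdirectory:

# only descend into directories that actually contain a .git folder
for d in */; do [ -d "$d/.git" ] && ( cd "$d" && git pull ); done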
Semicolon answered 10/7, 2016 at 12:37 Comment(4)
For Windows, see my answer here: https://mcmap.net/q/94136/-updating-all-repos-in-a-folder. It's very similar to the above. – Geter
The accepted answer used to work for me, but it stopped working sometime in the last year. I finally decided to look for another solution and found your answer. Thanks for sharing. It works perfectly. – Madeline
No need to pushd and popd, just use git's -C option. – Fondle
Another option is to use a subshell: for d in *; do (cd $d && git pull --ff-only); done This has an advantage over -C as this approach is universally applicable to programs which don't have such an option. Note that --ff-only is good to have in case of automatic updates; I use it to update vim plugins if native vim package management is used. – Begat
14

This should happen automatically, as long as cms, admin and chart are all part of the same repository.

A likely issue is that each of these plugins is a git submodule.

Run git help submodule for more information.

EDIT

For doing this in bash:

cd plugins
for f in cms admin chart
do 
  cd $f && git pull origin master && cd ..
done
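
As the comment about subshells below points out, wrapping the loop body in parentheses restores the working directory automatically, so a failed pull cannot leave the loop in the wrong place; a minimal variant:

cd plugins
# each iteration runs in a subshell, so there is no need to cd back out
for f in cms admin chart; do ( cd "$f" && git pull origin master ); done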
Kootenay answered 16/8, 2010 at 20:45 Comment(6)
No, sorry, you misunderstood. Each of those directories is a separate git repository. /plugins is not a repository. – Objectify
Ahhh. My mistake. Will give you the bash solution in a minute. – Kootenay
There you go. If you want to return to the parent directory, just run another cd .. afterwards. – Kootenay
Or use pushd and popd, or put the group of commands in a subshell (when the subshell exits, you'll be left in the original directory): (cd dir; for ... done) – Tumble
Cool, that works. Now is there a way to pass the ssh password to the bash script and have the script pass it to git/ssh instead of prompting every time for my password? – Objectify
Out of curiosity, why aren't you using ssh keys instead? – Kootenay
12

The mr utility (a.k.a., myrepos) provides an outstanding solution to this very problem. Install it using your favorite package manager, or just grab the mr script directly from github and put it in $HOME/bin or somewhere else on your PATH. Then, cd to the parent plugins folder shared by these repos and create a basic .mrconfig file with contents similar to the following (adjusting the URLs as needed):

# File: .mrconfig
[cms]
checkout = git clone 'https://<username>@github.com/<username>/cms' 'cms'

[admin]
checkout = git clone 'https://<username>@github.com/<username>/admin' 'admin'

[chart]
checkout = git clone 'https://<username>@github.com/<username>/chart' 'chart'

After that, you can run mr up from the top level plugins folder to pull updates from each repository. (Note that this will also do the initial clone if the target working copy doesn't yet exist.) Other commands you can execute include mr st, mr push, mr log, mr diff, etc.; run mr help to see what's possible. There's an mr run command that acts as a pass-through, allowing you to access VCS commands not directly supported by mr itself (e.g., mr run git tag STAGING_081220015). And you can even create your own custom commands that execute arbitrary bits of shell script targeting all repos!

mr is an extremely useful tool for dealing with multiple repos. Since the plugins folder is in your home directory, you might also be interested in vcsh. Together with mr, it provides a powerful mechanism for managing all of your configuration files. See this blog post by Thomas Ferris Nicolaisen for an overview.
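
As a sketch of that kind of customization (assuming mr's per-repository command overrides, as documented in the mrconfig man page), you could make mr up rebase instead of merge for a single repo:

# File: .mrconfig (excerpt)
[chart]
checkout = git clone 'https://<username>@github.com/<username>/chart' 'chart'
update = git pull --rebase origin master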

Gaza answered 17/8, 2015 at 23:27 Comment(0)
12

Most compact method, assuming all sub-dirs are git repos:

ls | parallel git -C {} pull
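
To restrict this to subdirectories that actually contain a .git folder, a sketch assuming GNU parallel (whose {//} replacement strips the last path component):

# {//} turns cms/.git into cms, so git -C runs in each repo root
ls -d */.git | parallel git -C {//} pull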
Puff answered 6/11, 2015 at 0:17 Comment(2)
Can't find command parallel. Reference? – Sanctuary
@LeonTepe This tool is usually included with the moreutils package. – Puff
11

My humble construction, which

  • shows the current path (using python; convenient and it just works, see How to get full path of a file?)
  • looks directly for .git subfolders, so there is little chance of running a git command in a non-git folder
  • gets rid of some find warnings

is as follows:

find . \
    -maxdepth 2 -type d \
    -name ".git" \
    -execdir python -c 'import os; print(os.path.abspath("."))' \; \
    -execdir git pull \;

Of course, you may add other git commands with additional -execdir options to find, displaying the branch for instance:

find . \
    -maxdepth 2 -type d \
    -name ".git" \
    -execdir python -c 'import os; print(os.path.abspath("."))' \; \
    -execdir git branch \; \
    -execdir git pull \;
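
The Python call is only there to print an absolute path; a simpler sketch does the same with pwd, since -execdir runs each command from the directory containing the matched .git:

# print the repository directory, then pull it
find . \
    -maxdepth 2 -type d \
    -name ".git" \
    -execdir pwd \; \
    -execdir git pull \;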
Otoole answered 10/11, 2017 at 8:27 Comment(2)
Not sure why there are net downvotes on this answer. This was the most helpful, imo, since I could go to a greater max depth and not keep hitting non-git repos. I used this to run git gc on all the repositories in my "developer" repo. – Hibbler
I also like this answer as it's easy to read and very easy to add multiple commands. – Dill
5

You can try this

find . -type d -name .git -exec sh -c "cd \"{}\"/../ && pwd && git pull" \;

Also, you can add your own customized output by adding one more && command, like:

find . -type d -name .git -exec sh -c "cd \"{}\"/../ && pwd && git pull && git status" \;
Interphase answered 3/4, 2015 at 9:58 Comment(0)
5

gitfox is a tool to execute commands on all subrepos:

npm install gitfox -g
g pull
Reeder answered 13/1, 2017 at 6:57 Comment(3)
What's g? Not everyone has the same alias as yours. – Arthurarthurian
@LưuVĩnhPhúc gitfox installs itself under the alias "g" for some reason (though the help message says "gitfox"). Personally I do not think it's a command important enough to claim such a shortcut, but ah well. It does the job, though. – Hawkshaw
@LưuVĩnhPhúc Check the source repo for the usage: github.com/eqfox/gitfox – Selfsame
5

I combined points from several comments and answers:

find . -maxdepth 2 -type d -name .git -execdir git pull \;
Fairground answered 11/9, 2017 at 14:59 Comment(0)
2

I use this

for dir in $(find . -name ".git")
do cd ${dir%/*}
    echo $PWD
    git pull
    echo ""
    cd - > /dev/null
done

Github

Navvy answered 6/4, 2016 at 13:10 Comment(0)
0

Original answer 2010:

If all of those directories are separate git repos, you should reference them as submodules.

That means your "origin" would be that remote repo 'plugins' which only contains references to subrepos 'cms', 'admin', 'chart'.

A git pull followed by a git submodule update would achieve what you are looking for.
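
A minimal sketch of that setup (repository URLs are placeholders): make plugins a parent repository, add each plugin as a submodule, and from then on a pull plus submodule update in the parent keeps everything current:

cd plugins
git init
git submodule add <url-of-cms> cms
git submodule add <url-of-admin> admin
git submodule add <url-of-chart> chart
git commit -m "Add plugin submodules"

# later, assuming the parent repo itself has a remote to pull from:
git pull
git submodule update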


Update January 2016:

With Git 2.8 (Q1 2016), you will be able to fetch submodules in parallel (!) with git fetch --recurse-submodules -j2.
See "How to speed up / parallelize downloads of git submodules using git clone --recursive?"

Katelyn answered 16/8, 2010 at 21:26 Comment(2)
See also https://mcmap.net/q/12700/-git-submodule-update and #1030669: git submodule foreach git pull can also be of interest. – Katelyn
Note: to clarify, 'plugins', which is not a git repo at the moment, should be made one, as a parent git repo for the submodules. – Katelyn
-1

If you have a lot of subdirectories with git repositories, you can use parallel:

ls | parallel -I{} -j100 '
  if [ -d {}/.git ]; then
    echo Pulling {}
    git -C {} pull > /dev/null && echo "pulled" || echo "error :("
  else
    echo {} does not contain a .git directory
  fi
'
Corbicula answered 13/4, 2015 at 16:48 Comment(1)
Great one. I modified it a bit to suit my style: pastebin – Venola
