I think @Aredridel's post came closest, but there's a bit more to it, so I will add this here. The thing is: in svn, if you're in a subfolder of a repo and you run:
/media/disk/repo_svn/subdir$ svn export . /media/disk2/repo_svn_B/subdir
then svn will export all the files that are under revision control (including those with freshly Added or Modified status); any other "junk" in that directory (and I'm not counting .svn subfolders here, but visible stuff like .o files) will not be exported - only the files registered in the SVN repo are. One nice thing is that this export also includes files with local changes that have not been committed yet; another nice thing is that the timestamps of the exported files match the original ones. Or, as svn help export puts it:
- Exports a clean directory tree from the working copy specified by
PATH1, at revision REV if it is given, otherwise at WORKING, into
PATH2. ... If REV is not specified, all local
changes will be preserved. Files not under version control will
not be copied.
To see that git will not preserve the timestamps, compare the output of these commands (in a subfolder of a git repo of your choice):
/media/disk/git_svn/subdir$ ls -la .
... and:
/media/disk/git_svn/subdir$ git archive --format=tar --prefix=junk/ HEAD | (tar -t -v --full-time -f -)
... In any case, I notice that git archive causes all the timestamps of the archived files to be the same! git help archive says:
git archive behaves differently when given a tree ID versus when given a commit ID or tag ID. In the first case the
current time is used as the modification time of each file in the archive. In the latter case the commit time as recorded
in the referenced commit object is used instead.
... but apparently both cases set the "modification time of each file", thereby not preserving the actual timestamps of those files!
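To make this concrete, here is a small self-contained demonstration of my own (the throwaway repo under mktemp, the file name a.txt and the 2001 date are all my assumptions, not part of the setup above) showing that git archive stamps every archived file with the commit time, regardless of the file's actual mtime on disk:

```shell
# Build a throwaway repo whose tracked file has a deliberately old
# mtime, then list the tar that `git archive` produces: the archive
# member carries the commit time, not the old on-disk mtime.
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q
git config user.email you@example.com
git config user.name you
echo one > a.txt
touch -d '2001-01-01 00:00:00' a.txt   # old mtime on disk
git add a.txt
git commit -qm init
# The listed date is the commit date, not 2001-01-01:
git archive HEAD | tar tvf -
```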
So, in order to preserve the timestamps as well, here is a bash script; it is actually a "one-liner", albeit a somewhat complicated one, so it is posted below across multiple lines:
/media/disk/git_svn/subdir$ git archive --format=tar master | (tar tf -) | (\
DEST="/media/diskC/tmp/subdirB"; \
CWD="$PWD"; \
while IFS= read -r line; do \
case "$line" in */) continue;; esac; \
DN=$(dirname "$line"); BN=$(basename "$line"); \
SRD="$CWD"; TGD="$DEST"; \
if [ "$DN" != "." ]; then \
SRD="$SRD/$DN" ; TGD="$TGD/$DN" ; \
if [ ! -d "$TGD" ] ; then \
CMD="mkdir -p \"$TGD\"; touch -r \"$SRD\" \"$TGD\""; \
echo "$CMD"; \
eval "$CMD"; \
fi; \
fi; \
CMD="cp -a \"$SRD/$BN\" \"$TGD/\""; \
echo "$CMD"; \
eval "$CMD"; \
done \
)
Note that it is assumed that you're exporting the contents of the "current" directory (above, /media/disk/git_svn/subdir), and that the destination you're exporting into is somewhat inconveniently placed, but held in the DEST shell variable. Note also that you must create the DEST directory manually yourself before running the above script.
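As a shorter alternative sketch of my own (it assumes GNU coreutils; the repo layout and destination path below are hypothetical): git ls-files lists the tracked files under the current directory, and GNU cp with -a --parents recreates the subdirectory structure under the destination while preserving each file's mtime. Like the script above, this copies the working-tree files, so uncommitted local changes survive too.

```shell
# Self-contained sketch: build a throwaway repo, then export the
# tracked files with `git ls-files` + GNU `cp -a --parents`, which
# recreates subdirectories and preserves each file's mtime.
set -e
tmp=$(mktemp -d); cd "$tmp"
git init -q
git config user.email you@example.com
git config user.name you
mkdir -p sub
echo hi > sub/f.txt
touch -d '2001-01-01 00:00:00' sub/f.txt   # give the file an old mtime
git add . && git commit -qm init
DEST="$tmp/out"          # hypothetical destination directory
mkdir -p "$DEST"
git ls-files -z | xargs -0 cp -a --parents -t "$DEST"
ls -l --full-time "$DEST/sub/f.txt"        # mtime kept from the source
```

The -z/-0 pair keeps file names with spaces intact; -t names the target directory so xargs can append any number of source paths after it.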
After the script has run, you should be able to compare:
ls -la /media/disk/git_svn/subdir
ls -la /media/diskC/tmp/subdirB # DEST
... and hopefully see the same timestamps (for the files that were under version control).
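If eyeballing ls -la is too error-prone, the check can be scripted. This is a self-contained sketch of my own (the throwaway repo and the flat f.txt layout are assumptions; the export is simulated here with a plain cp -a): it compares the mtime of every tracked file between source and destination and prints nothing when they all agree.

```shell
# Build a throwaway repo, "export" it with cp -a, then verify that
# each tracked file's mtime in the destination matches the source;
# the loop prints a line per mismatch and stays silent on success.
set -e
tmp=$(mktemp -d); cd "$tmp"
git init -q
git config user.email you@example.com
git config user.name you
echo hi > f.txt
touch -d '2001-01-01 00:00:00' f.txt
git add . && git commit -qm init
DST="$tmp/out"; mkdir -p "$DST"
git ls-files | while IFS= read -r f; do cp -a "$f" "$DST/"; done
git ls-files | while IFS= read -r f; do \
  [ "$(stat -c %Y "$f")" = "$(stat -c %Y "$DST/$f")" ] \
    || echo "mtime differs: $f"; \
done
```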
Hope this helps someone,
Cheers!
Comments:
- git archive --format zip --output "output.zip" master -0 will give you an uncompressed archive (-0 is the flag for uncompressed); see git-scm.com/docs/git-archive – Maize
- ... export a 250 kB subdirectory directly from the remote repository (which could otherwise be 200 MB in size, excluding revisions) - and I will only hit the network for a 250 kB (or so) download transfer. With git, archive has to be enabled on the server (so I can't try it) - clone --depth 1 from the server may still retrieve a repo of, say, 25 MB, where the .git subfolder alone takes 15 MB. Therefore, I'd still say the answer is "no". – Submaxillary
- git checkout-index – Abeyant
- git archive -o latest.zip HEAD – Alkyl