Deploying only the changed parts of a website with git to FTP (svn2web for git)
I have a website with many big image files. The source (as well as the images) is maintained with git. I want to deploy it via FTP to a Bluehost-like cheap server.

I do not want to deploy the whole website each time (so that I won't upload unchanged files over and over); instead, I want roughly the following:

  1. In the git repository, mark the last deployed revision with a tag "deployed".
  2. When I say "deploy revision X", find out which files have changed between revision X and the revision tagged "deployed", and upload just those.
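The two steps above can be sketched in a few lines of shell (the tag name "deployed" is from the question; the upload step is left as a placeholder, since this is not an existing tool):

```shell
#!/bin/sh
# Sketch of the tag-based deploy described above.
# Usage: ./deploy.sh <revision>   (defaults to HEAD)
rev="${1:-HEAD}"

# Files changed between the last deployed revision and the target one
changed=$(git diff --name-only deployed "$rev")

# Upload just those files (placeholder: plug in your FTP client here)
for f in $changed; do
    echo "upload: $f"
done

# Move the "deployed" tag forward to the revision we just pushed
git tag -f deployed "$rev"
```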

It is similar in spirit to svn2web, but I want it for a DVCS. A Mercurial alternative would also be considered.

It's a pretty simple script to write, but I'd rather not reinvent the wheel if a similar script already exists on the web.

Capistrano and fab seem to know only how to push a whole revision in their SCM integration, so I don't think I can use them for this.

Gi answered 14/5, 2009 at 20:55 Comment(1)
A nice tool which is not SCM-related but does similar work, by caching the FTP site state and pushing up only the changed files, is weex.sf.net - Gi

The git-ftp script might be what you are looking for. It takes the changes in your local git repository and syncs them to a remote git repo over FTP.

I used it by creating a git repo with the --bare option and putting it on my FTP server.

Then I ran ./git-ftp.py. It prompts for the FTP username, password, FTP host, local git repo path, and remote git repo path (the location of the bare repository).

It then connects to the git repo over FTP and sends only the diffs (it uses the git-python library to get the information it needs).

The script has a few issues. It always prompts for the username details, and I had to comment out line 68:

#ftp.voidcmd('SITE CHMOD 755 ' + node.name).

But those things can be easily fixed.
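The core of what such a sync does can be sketched with plain git and curl (the host, credentials, and remote path below are placeholders, not part of the git-ftp script):

```shell
# For each file changed since the last deployed revision, upload it
# over FTP; curl's --ftp-create-dirs creates missing remote directories.
for f in $(git diff --name-only deployed HEAD); do
    curl -sS -T "$f" \
        "ftp://user:[email protected]/public_html/$f" --ftp-create-dirs
done
```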

Alternative

If you are on a *nix platform, an alternative is to use curlftpfs. It mounts your FTP account as a local directory, in which you can do all the normal git operations (push, pull). Of course, this solution isn't git-specific.

You need to use the --bare option (as mentioned above) on the repo shared over FTP, and run git update-server-info within the repo before sharing it.
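A rough sequence, assuming curlftpfs is installed (the mount point, credentials, and repo names here are placeholders):

```shell
# Mount the FTP account as a local directory
mkdir -p ~/ftpmount
curlftpfs ftp://user:[email protected] ~/ftpmount

# Put a bare copy of the repo on the server and prep it for dumb transport
git clone --bare myrepo ~/ftpmount/myrepo.git
git -C ~/ftpmount/myrepo.git update-server-info

# From then on, push to it like any local path
git -C myrepo push ~/ftpmount/myrepo.git master
```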

Caution: this isn't a good idea if you plan to have multiple users writing to your git repo, since FTP has no locking mechanism; you will end up with a corrupt repo. Test before taking it to production.

Harmful answered 29/7, 2009 at 7:1 Comment(1)
Not perfect, but this could be the first step! Thanks. - Gi

I've created a script called git-deploy, hope it helps.

Jansen answered 15/5, 2010 at 20:45 Comment(1)
Very nice & easy to use. Good job! - Gorilla

You can store the latest deployed revision in a file somewhere; then you can simply get the names of the changed files:

$ git diff --name-only $deployed $latest

Substitute the corresponding SHA-1 hashes; $latest can also be a branch name such as "master".
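Deleted files need separate handling (they should be removed from the server, not uploaded); git's --diff-filter can split the list. A sketch, not a full deploy script:

```shell
# Added/changed files -> upload; deleted files -> remove from the server
git diff --name-only --diff-filter=ACMR "$deployed" "$latest"   # to upload
git diff --name-only --diff-filter=D    "$deployed" "$latest"   # to delete remotely
```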

Librate answered 14/5, 2009 at 21:21 Comment(3)
I'm quoting myself: "It's a pretty simple script to write, but I'd rather not reinvent the wheel if there's some similar script on the web." - Gi
So? There's the list of changed files; now go and upload them. - Bronwen
Is it really that simple? Will the script never need more development? I think not. For example, what about removed files? Those should be purged from the FTP site. But maybe we want to verify first that the purged file wasn't changed on the server. What about file attributes that have changed (sometimes possible to change via an FTP extension, I think)? Not so simple, see? That's why I'd rather not reinvent the wheel. - Gi

Another option would be to use git archive.

Of course, as mentioned by Joey in his "git archive as distro package format":

The tricky part of using a git (or other rcs) archive as distribution source package format is handling pristine upstream tarballs.

  • One approach would be to try to create a git archive that didn't include objects present in the upstream tarball. Then, to unpack the source package, you'd unpack the upstream tarball, convert the files in it into git objects and add them into the .git directory.
    This seems like it might be possible to implement, but you'd need to know quite a lot about git internals to remove the redundant objects from the git repo and regenerate them from the tarball.

  • Another approach would be to keep the pristine upstream tarball in the git archive, and then the source package would consist entirely of the git archive. This doesn't have the same nice minimal bandwidth upload behavior -- unless you can "git push" your changes to do the upload

Storing a lot of upstream tarballs in git wouldn't be efficient, but the pristine-tar script takes care of that:

pristine-tar can regenerate a pristine upstream tarball using only a small binary delta file and a copy of the source which can be a revision control checkout.
The package also includes a pristine-gz command, which can regenerate a pristine .gz file.
The delta file is designed to be checked into revision control along-side the source code, thus allowing the original tarball to be extracted from revision control.

More details are in the header of the pristine-tar Perl script itself.

Evieevil answered 14/5, 2009 at 21:29 Comment(2)
Thanks. Not exactly what I wanted, as it's archive-file-specific rather than web-specific. But it's pretty close. (The only drawback is that it's written in Perl :-) - Gi
@Elazar: true. I do not think this is the ideal solution in your case, but I thought it was worth mentioning as a possible answer for similar (but not web-specific) deployment issues. - Evieevil

You might just as well use wput (wput --timestamping --reupload --dont-continue), which is like wget, but for FTP uploading.

Exarch answered 9/6, 2009 at 15:49 Comment(0)

A tiny Bash solution for Mercurial: hg-deploy

Tonga answered 11/9, 2014 at 12:40 Comment(0)

The command git-ftp push from git-ftp seems to work quite well.

Install it:

sudo apt-get install git-ftp

After installing it, configure your FTP account:

git config git-ftp.url ftp.example.net
git config git-ftp.user your-ftp-user
git config git-ftp.password your-secr3t

Then initialize the first deployment:

git-ftp init

And then, for every change, you just need:

git add -A
git commit -m "update"
git-ftp push

The files will be uploaded to the user's home (~/) directory.

Commonly answered 26/6, 2017 at 13:44 Comment(0)

For GitHub users, you can use FTP Deploy Action.

Just add the following code in /.github/workflows/main.yml:

on: push
name: 🚀 Deploy website on push
jobs:
  web-deploy:
    name: 🎉 Deploy
    runs-on: ubuntu-latest
    steps:
    - name: 🚚 Get latest code
      uses: actions/checkout@v2
    
    - name: 📂 Sync files
      uses: SamKirkland/[email protected]
      with:
        server: <ftp_server>
        username: <ftp_username>
        password: ${{ secrets.ftp_password }}

On each push to master, only the files changed since the last deployment will be automatically uploaded to the FTP server.

Rento answered 3/10, 2021 at 21:9 Comment(0)

© 2022 - 2024 — McMap. All rights reserved.