FTP to Google Storage

Some files get uploaded on a daily basis to an FTP server and I need those files under Google Cloud Storage. I don't want to bug the users that upload the files to install any additional software and just let them keep using their FTP client. Is there a way to use GCS as an FTP server? If not, how can I create a job that periodically picks up the files from an FTP location and puts them in GCS? In other words: what's the best and simplest way to do it?

Peele asked 19/4, 2017 at 4:36 Comment(1)
Seems like one way is to set up an FTP server on a VM and use gcsfs to connect this server to GCS, as described here: ilyapimenov.com/blog/2015/01/19/ftp-proxy-to-gcs.html - does this work for you? - Innervate

I have successfully set up an FTP proxy to GCS using gcsfs on a VM in Google Compute Engine (mentioned by jkff in the comment on my question), following these instructions: http://ilyapimenov.com/blog/2015/01/19/ftp-proxy-to-gcs.html

Some changes to those instructions are needed, though. Some possible problems you may run into:

  • If you can access the FTP server using the local IP but not the remote IP, it's probably because you haven't set up the firewall rules.
  • If you can access the FTP server but are unable to write, it's probably because you need write_enable=YES (see the sketch below).
  • If you are trying to read the folder you created on /mnt but get an I/O error, it's probably because the bucket in gcsfs_config is not right.

Also, your FTP client needs to have its transfer mode set to "passive".
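
For reference, the relevant vsftpd options look roughly like this (a sketch assuming the blog's vsftpd-based setup; the port range and external IP are placeholders that must match your firewall rules):

# /etc/vsftpd.conf
write_enable=YES       # allow uploads
pasv_enable=YES        # clients must use passive mode, as noted above
pasv_min_port=60000    # passive data ports; open this range in the firewall
pasv_max_port=60999
pasv_address=x.x.x.x   # placeholder: the VM's external IP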

Peele answered 24/4, 2017 at 0:53 Comment(3)
We did this but had huge amounts of intermittent errors with all ready-made FTP solutions. The only thing that worked out in the end was pyftpdlib, which we run on a FUSE-mounted GCS bucket. - Jaysonjaywalk
We tried that as well, but we sometimes see gcsfuse dropping the connection, so I wouldn't suggest running it in production. - Simulator
The link is no longer available :/ - Syncretism

You could write your own FTP server which uploads to GCS, for example based on pyftpdlib.

Define a custom handler which stores files to GCS when they are received:

import os
from pyftpdlib.handlers import FTPHandler
from pyftpdlib.servers import FTPServer
from pyftpdlib.authorizers import DummyAuthorizer
from google.cloud import storage

class MyHandler(FTPHandler):  # must subclass FTPHandler for the hooks below to fire
    def on_file_received(self, file):
        # 'file' is the local path of the completed upload;
        # copy it to GCS, then delete the local copy
        storage_client = storage.Client()
        bucket = storage_client.get_bucket('your_gcs_bucket')
        blob = bucket.blob(file[5:])  # strip the leading '/tmp/'
        blob.upload_from_filename(file)
        os.remove(file)

    # implement other events (on_file_sent, on_incomplete_file_received, ...) as needed

def main():
    authorizer = DummyAuthorizer()
    authorizer.add_user('user', 'password', homedir='/tmp', perm='elradfmw')

    handler = MyHandler
    handler.authorizer = authorizer
    # replace with your server's external IP so passive-mode replies advertise it
    handler.masquerade_address = 'your.public.ip'
    handler.passive_ports = range(60000, 61000)  # data ports 60000-60999; open these in the firewall

    # bind to '0.0.0.0' instead of '127.0.0.1' to accept connections from outside the VM
    server = FTPServer(("127.0.0.1", 21), handler)
    server.serve_forever()

if __name__ == "__main__":
    main()

I've successfully run this on Google Container Engine (it takes some effort to get passive FTP working properly), but it should be pretty simple to do on Compute Engine. Per the above configuration, open port 21 and ports 60000-60999 on the firewall.
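
For example, on Compute Engine a single rule can open all of them (a sketch assuming the default network; the rule name allow-ftp is arbitrary):

gcloud compute firewall-rules create allow-ftp --allow tcp:21,tcp:60000-60999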

To run it: python my_ftp_server.py. If you want to listen on port 21 you'll need root privileges (or grant the process the CAP_NET_BIND_SERVICE capability).

Delisle answered 19/4, 2017 at 11:9 Comment(8)
Where should this file go? How does the user authenticate with the FTP client (what are the host name, user, and password)? - Peele
I think crazystick is suggesting that the user authenticates with the FTP server however you like, and the FTP server, which you're running, has credentials to upload the objects to GCS. So you write to the FTP server, and the FTP server forwards that upload stream on to GCS. - Mauer
Yes - look at the docs for pyftpdlib and you will find a number of options for authentication. In the example above, everyone would connect to the FTP server using username "user" and password "password", and all files get dumped in the same GCS bucket with default security. Running on Compute Engine / Container Engine gets you credentials for GCS. - Delisle
Thank you... I know I might be asking for too much, but where should this file go, and what config changes need to be made in the VM in Compute Engine? Also, should there be any consideration for passive FTP and connecting to the external IP? - Peele
I added a couple of extra config options you'll probably want in order to run it on GCE. To have it start automatically you would have to write a systemd service for it. That should be pretty trivial, and there are plenty of resources explaining how. - Delisle
Make sure you use a recent version of pyftpdlib! You want a version that includes my patch to gracefully handle I/O errors. The GCS SLA allows for failed writes, and they do happen from time to time (about once every other month for us; we write about 5 files a minute). The patch will be in version 1.5.3, which is not released yet. - Jaysonjaywalk
I started integrating pyftpdlib and pyfilesystem2, only to find out that pyftpdlib is very tightly coupled to platform-dependent calls such as os.path.join and os.path.realpath. They have an AbstractedFS class, but they don't use it everywhere. Did anyone have success using pyftpdlib as a proxy to any storage backend? - Attitudinize
I have gotten it to work on 127.0.0.1. However, I'm not able to try it externally using the masquerade_address. - Serif

You could set up a cron job to rsync between the FTP server and Google Cloud Storage using gsutil rsync or the open-source rclone tool.

If you can't run those commands on the FTP server periodically, you could mount the FTP server as a local filesystem or drive (Linux, Windows) and run the sync from another machine.
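
A minimal cron sketch, assuming the uploads land in /srv/ftp/uploads and a placeholder bucket gs://your-gcs-bucket (both names are illustrative):

# sync every 15 minutes; rsync only copies new or changed files
*/15 * * * * gsutil -m rsync -r /srv/ftp/uploads gs://your-gcs-bucket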

Platonism answered 21/4, 2017 at 20:48 Comment(2)
How exactly do you rsync when the bucket is the source? - Christiniachristis
You would need to mount the bucket as a filesystem somewhere, for example by using gcs-fuse: cloud.google.com/storage/docs/gcs-fuse - Platonism

Set up a VM in Google Cloud, using some *nix flavor. Set up FTP on it, and point it to a folder abc. Use gcsfuse to mount abc as a GCS bucket. Voila: back and forth between GCS and FTP without writing any software. (Small print: FUSE rolls up and dies if you push too much data through it, so bounce it periodically, once a week or once a day; you might also need to set the mount or FUSE to allow permissions for all users.)
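
A rough sketch of that mount, assuming a placeholder bucket your-gcs-bucket and mount point /mnt/abc; the allow_other option is what lets the FTP daemon's non-root user reach the files, and it requires user_allow_other to be enabled in /etc/fuse.conf:

sudo mkdir -p /mnt/abc
sudo gcsfuse -o allow_other --implicit-dirs your-gcs-bucket /mnt/abc
# to bounce the mount on a schedule, unmount and mount again:
sudo fusermount -u /mnt/abc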

Cryoscopy answered 27/10, 2018 at 13:30 Comment(0)

I am using the sftpgo service (https://github.com/drakkan/sftpgo). It can connect directly to GCP/AWS buckets and more, and it also has a web interface for management and client use.
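
For a quick trial there is an official Docker image (a sketch following the project's README at the time of writing; the default ports are 8080 for the web admin and 2022 for SFTP, and both may change between releases):

docker run --name sftpgo -p 8080:8080 -p 2022:2022 drakkan/sftpgo

Users and their GCS-backed storage can then be configured from the web admin.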

Samekh answered 8/5, 2024 at 16:47 Comment(0)
