How to improve the Performance of FtpWebRequest?

I have an application written in .NET 3.5 that uses FTP to upload/download files from a server. The app works fine, but there are performance issues:

  1. It takes a lot of time to make a connection to the FTP server. The FTP server is on a different network and runs Windows Server 2003 (IIS FTP). When multiple files are queued for upload, switching from one file to the next creates a new connection via FtpWebRequest, and that takes a lot of time (around 8-10 seconds).

  2. Is it possible to re-use the connection? I am not very sure about the KeepAlive property. Which connections are kept alive and reused?

  3. The IIS FTP service on Windows Server 2003 does not support SSL, so anyone can easily see the username/password with a packet sniffer such as Wireshark. I found that Windows Server 2008 supports FTP over SSL in its new version of IIS, IIS 7.0.

I basically want to improve the upload/download performance of my application. Any ideas will be appreciated.

** Please note that point 3 is not a performance issue, but I would like to hear people's comments on it.

Minimum answered 19/6, 2009 at 7:2 Comment(1)
3 isn't really a performance issue (although it is still an issue); I'd suggest tackling that separately. - Anastasiaanastasie

I have done some experimentation (uploading about 20 files of various sizes) with FtpWebRequest, varying the following factors:

KeepAlive = true/false

ftpRequest.KeepAlive = isKeepAlive;

Connection Group Name = user-defined or null

ftpRequest.ConnectionGroupName = "MyGroupName";

Connection Limit = 2 (default) or 4 or 8

ftpRequest.ServicePoint.ConnectionLimit = ConnectionLimit;

Mode = Synchronous or Async

see this example

My results:

  1. ConnectionGroupName, KeepAlive=true took 21188.62 msec

  2. ConnectionGroupName, KeepAlive=false took 53449.00 msec

  3. No ConnectionGroupName, KeepAlive=false took 40335.17 msec

  4. ConnectionGroupName, KeepAlive=true, async=true, connections=2 took 11576.84 msec

  5. ConnectionGroupName, KeepAlive=true, async=true, connections=4 took 10572.56 msec

  6. ConnectionGroupName, KeepAlive=true, async=true, connections=8 took 10598.76 msec

Conclusions

  1. FtpWebRequest has been designed to support an internal connection pool. To ensure the connection pool is used, make sure ConnectionGroupName is set.

  2. Setting up a connection is expensive. If we are connecting to the same FTP server with the same credentials, setting KeepAlive to true will minimize the number of connections set up.

  3. Asynchronous calls are the recommended way to go if you have a lot of files to transfer.

  4. The default number of connections is 2. In my environment, a connection limit of 4 gave me the best overall performance. Increasing the number of connections beyond that may or may not improve performance. I would recommend making the connection limit a configuration parameter so that you can tune it for your environment.
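
For reference, a minimal sketch of the settings used in these tests (the URL, credentials, and group name are placeholders):

var ftpRequest = (FtpWebRequest)WebRequest.Create("ftp://ftp.example.com/remote.dat");
ftpRequest.Method = WebRequestMethods.Ftp.UploadFile;
ftpRequest.Credentials = new NetworkCredential("user", "password");
ftpRequest.KeepAlive = true;                    // keep the control connection open for reuse
ftpRequest.ConnectionGroupName = "MyGroupName"; // pool connections under one group
ftpRequest.ServicePoint.ConnectionLimit = 4;    // raise the default limit of 2

Later requests created with the same ConnectionGroupName and credentials can then pick up a pooled connection instead of opening a new one.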

Hope you find this useful.

Clergy answered 23/6, 2010 at 1:32 Comment(2)
While I agree with most of your comments, I think your test is very questionable. "21188.62 msec"? You probably meant to say "21 s" instead. Did you run this once, or many times with the results averaged? Concluding that connections=4 gives the most gain from a difference of 30 ms is not justifiable. No offense intended, but the test as it stands is not helpful, rather misleading. If you could fix the issues (accuracy vs. precision, multiple runs, more files, background explanations, etc.) this would be a great post. - Chapter
I doubt that he/she typed "21188.62 msec" by accident. Maybe you meant to say "listing times in seconds would be easier to read"? And I don't think the test is unhelpful. Maybe drawing conclusions from a limited set of data is ill-advised, but the data is there if you want to use it. At the very least it's a starting point. - Immensity

It doesn't matter if individual connections take a long time to connect, as long as you can launch many in parallel. If you have many items to transfer (say, hundreds), it makes sense to launch tens or even hundreds of WebRequests in parallel, using the asynchronous methods like BeginGetRequestStream and BeginGetResponse. I worked on projects that faced similar problems (long connect/authenticate times), but by issuing many calls in parallel the overall throughput was actually very good.

Also, it makes a huge difference whether you use the async methods or the synchronous ones once you have many (tens, hundreds) of requests. This applies not only to the WebRequest methods, but also to the Stream read/write methods you'll use after obtaining the upload/download stream. The Improving .NET Performance and Scalability book is a bit outdated, but much of its advice still stands, and it is free to read online.

One thing to consider is that the ServicePointManager class sits there lurking in the Framework with one sole purpose: to ruin your performance. Make sure you obtain the ServicePoint of your URL and change the ConnectionLimit to a reasonable value (at least as high as the number of concurrent requests you intend to make).
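
As an illustration, here is a minimal sketch of that pattern (the server URL, credentials, and local folder are placeholders, and error handling is omitted):

using System;
using System.IO;
using System.Net;

class ParallelFtpUploads
{
    static void Main()
    {
        // Raise the per-host limit first, or the "parallel" requests will
        // queue behind the default limit of 2.
        ServicePointManager.FindServicePoint(new Uri("ftp://ftp.example.com/")).ConnectionLimit = 10;

        foreach (string path in Directory.GetFiles(@"C:\outgoing"))
            BeginUpload(path, "ftp://ftp.example.com/" + Path.GetFileName(path));

        Console.ReadLine(); // crude wait for the async callbacks in this sketch
    }

    static void BeginUpload(string localPath, string remoteUrl)
    {
        var request = (FtpWebRequest)WebRequest.Create(remoteUrl);
        request.Method = WebRequestMethods.Ftp.UploadFile;
        request.Credentials = new NetworkCredential("user", "password");
        request.KeepAlive = true;

        // The callback runs once the connection is ready; it streams the
        // file into the request stream, then completes the request.
        request.BeginGetRequestStream(ar =>
        {
            using (Stream ftpStream = request.EndGetRequestStream(ar))
            using (FileStream file = File.OpenRead(localPath))
            {
                byte[] buffer = new byte[81920];
                int read;
                while ((read = file.Read(buffer, 0, buffer.Length)) > 0)
                    ftpStream.Write(buffer, 0, read);
            }
            request.BeginGetResponse(ar2 =>
            {
                using (var response = (FtpWebResponse)request.EndGetResponse(ar2)) { }
            }, null);
        }, null);
    }
}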

Alyshaalysia answered 25/6, 2009 at 8:7 Comment(3)
What does the ServicePointManager have to do with it when I am just using a single connection? My application uploads multiple files one by one. What should I set the ConnectionLimit to? Also, I observed that it takes a lot of time to make a connection when a connection is already active. Generally it takes 4.5 seconds for the first FTP GetRequestStream() to return; then all subsequent connections take 1.3 seconds, but if the connections overlap, it takes 12 seconds to create a connection. - Minimum
If a connection is already active, the second one will not connect until the first one finishes; that's the very thing the ServicePointManager controls (it throttles connections on your behalf). If you are determined to use one connection and serialize all requests, then the ServicePointManager will make no difference. My point was about doing all requests in parallel. - Alyshaalysia
This is a very good answer as long as the sequence in which the files are retrieved is not too important. The requests should be processed in the order they are made, but obviously they can finish at different times depending on the size of the transfers. If you require that certain transfers finish before others (such as to control the order in which retrieved files are processed), this may not be the ideal method. - Maidenhood

Debug Network

A few tricks for simple network debugging:

  1. Check the response times when you ping the FTP server from the application server.
  2. Check the response times for a trace route (tracert from a DOS shell).
  3. Transfer a file from the command-line using the ftp command.
  4. Connect to the FTP server via Telnet: telnet server 21.

The results will provide clues to solving the problem.
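
For example, from a Windows command prompt (the server name is a placeholder):

ping ftp.example.com
tracert ftp.example.com
ftp ftp.example.com
telnet ftp.example.com 21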

Network Hardware

For a slow trace route:

  • Determine why the two computers are having network issues.
  • Upgrade the network hardware at the slowest link.

Network Configuration

For a slow ping:

  • Check the network configuration on each machine.
  • Ensure the settings are optimal.

Validate API

A slow command-line FTP session tells you that the problem is not isolated to the FTP API you are using. It does not eliminate the API as a potential problem, but it certainly makes it less likely.

Network Errors

If packets are being dropped between the source and destination, ping will tell you. You might have to increase the packet size to 1500 bytes to see any errors.
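
On Windows, the ping flag that sets the payload size is -l (the server name is a placeholder):

ping -l 1500 ftp.example.com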

FTP Queue Server

If you have no control over the destination FTP server, have an intermediary server receive uploaded files. The intermediary then sends the files to the remote server at whatever speed it can. This gives the illusion that the files are being sent quickly. However, if the files must exist on the remote server as soon as they are uploaded, then this solution might not be viable.

FTP Server Software

Use a different FTP daemon on the FTP server, such as ProFTPd as a Windows service. (ProFTPd has plug-ins for various databases that allow authentication using SQL queries.)

FTP Server Operating System

A Unix-based operating system might be a better option than a Microsoft-based one.

FTP Client Software

There are a number of different APIs for sending and receiving files via FTP. It might take some work to make your application modular enough that you can simply plug in a new file transfer service. A few different APIs are listed as answers here.

Alternate Protocol

If FTP is not an absolute requirement, try:

  • a Windows network drive
  • HTTPS
  • scp, rsync, or similar programs (Cygwin might be required)
Bandit answered 28/6, 2009 at 20:15 Comment(0)

This link describes how ConnectionGroupName and KeepAlive affect connection reuse: WebRequest ConnectionGroupName

Dextrogyrate answered 13/10, 2010 at 12:23 Comment(0)

You should definitely check out BITS, which is a big improvement over FTP. Clear-text passwords aren't the only weakness in FTP; there's also the issue of predicting the port it will open for a passive upload or download, and the overall difficulty when clients are behind NAT or firewalls.

BITS works over HTTP/HTTPS using IIS extensions and supports queued uploads and downloads that can be scheduled at low priority. It's overall just a lot more flexible than FTP if you are using Windows on the client and server.

BITS for PowerShell

BITS for .NET

Methionine answered 23/6, 2009 at 8:23 Comment(0)

Personally, I have migrated all of our apps away from using FTP for file upload/download and have instead rolled a solution based on XML Web Services in ASP.NET.

Performance is much improved, security is as much or as little as you want to code (and you can use what's built into .NET), and it can all go over SSL with no issues.

Our success rate getting our clients' connections out through their own firewalls is FAR better than running FTP.

Hb answered 22/6, 2009 at 11:59 Comment(0)

Look at this page: http://www.ietf.org/rfc/rfc959.txt

It mentions using a different port when connecting so that the connection can be reused.
Does that work?

Impenitent answered 28/6, 2009 at 19:8 Comment(0)

I strongly suggest Starksoft FTP/FTPS Component for .NET and Mono. It has a connection object that you can cache and reuse.

Blacksmith answered 15/1, 2010 at 4:34 Comment(0)

I'd recommend switching to rsync.

Pros:

  • Optimised for reducing transfer time.
  • Supports SSH for secure transfers.
  • Uses TCP, so it makes your IT dept/firewall guys happier.

Cons:

  • No native .NET support.
  • Geared towards Linux server installations, though there are decent Windows ports such as DeltaCopy.

Overall, though, it's a much better choice than FTP.
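
For reference, a typical invocation might look like this (host and paths are placeholders; on Windows it would run under Cygwin or a port such as DeltaCopy):

rsync -avz -e ssh /local/outgoing/ user@ftp.example.com:/incoming/

Here -a preserves file attributes, -v is verbose, -z compresses in transit, and -e ssh tunnels the transfer over SSH.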

Bleachers answered 27/6, 2009 at 15:23 Comment(0)

I have had good results with EDT's ftp library:

http://www.enterprisedt.com/products/edtftpnet/overview.html

Thomasenathomasin answered 27/6, 2009 at 16:6 Comment(0)

To resolve the performance problem, you simply need to set:

ftpRequest.ConnectionGroupName = "MyGroupName";
ftpRequest.KeepAlive = false;
ftpRequest.ServicePoint.CloseConnectionGroup("MyGroupName");

(CloseConnectionGroup closes the connections pooled under the given group name, so they do not linger after the transfer.)

Absonant answered 4/8, 2012 at 12:18 Comment(0)

I know it's an old thread, but I recently had to go through a similar situation.

We needed to download 70+ XML files from an FTP server in under 25 minutes without opening more than 5 connections during that time frame.

These were all the alternatives we tried:

  • wget - It was fast, but every GET meant a new connection. We had to stop due to the number of connections created. We were having some issues with GETM that are well documented on the web, so that wasn't an option.
  • FtpWebRequest - Every Create call would log a new connection, even though we used the KeepAlive and ConnectionGroupName properties. Plus, it was very slow.
  • WebClient - We didn't check whether it created a new connection for every file (although I assume it does), but it copied one file per minute, so it wasn't an option.

We ended up using an old-fashioned FTP batch script. It's fast, and it uses only one connection to download all the files. It isn't flexible, but it's much faster than everything else I tried (75 files in under 20 minutes).
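
For reference, a sketch of that approach with the stock Windows ftp client (host, credentials, and paths are placeholders). Save the commands to a text file and run ftp -s:commands.txt:

open ftp.example.com
username
password
binary
prompt
cd /xml
lcd C:\downloads
mget *.xml
quit

The prompt command turns off per-file confirmation, so mget fetches every matching file over the single connection.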

Calamus answered 20/9, 2013 at 22:21 Comment(0)

KeepAlive does work. FtpWebRequest caches connections internally, so they can be reused for some time. For details and an explanation of this mechanism, look at ServicePoint.

Another good source of information is the FtpWebRequest source itself (you can view it in VS2008).
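
As an illustration of the ServicePoint side of this (the URL is a placeholder; 100 seconds is the documented default), you can lengthen how long a cached connection may sit idle before it is closed:

// Find the ServicePoint that manages connections for this FTP host and
// let pooled connections idle for 5 minutes instead of the default 100 s.
ServicePoint sp = ServicePointManager.FindServicePoint(new Uri("ftp://ftp.example.com/"));
sp.MaxIdleTime = (int)TimeSpan.FromMinutes(5).TotalMilliseconds;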

Ex answered 23/6, 2009 at 8:10 Comment(0)

AFAIK, each FtpWebRequest has to set up a new connection, including logging on to the server. If you want to speed up FTP transfers, I would recommend using an alternative FTP client instead. Some of these alternative clients can log in once and then perform multiple actions over the same command connection.

Examples of such clients include http://www.codeproject.com/KB/IP/FtpClient.aspx, which also includes a good explanation of why these libraries can operate faster than the standard FtpWebRequest, and http://www.codeproject.com/KB/macros/ftp_class_library.aspx, which looks like a simple enough implementation as well.

Personally, I rolled my own implementation of FTP back in the .NET 1.1 days before the FtpWebRequest was introduced and this still works well for .NET 2.0 onwards.

Maidenhood answered 27/6, 2009 at 15:8 Comment(0)

A single piece of advice:

LOWER BUFFER/CHUNK SIZES SIGNIFICANTLY REDUCE PERFORMANCE

Reason: far more disk I/O, far more memory I/O, repeated FTP stream initialization, and many other per-chunk costs.
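
As an illustration (the buffer sizes mentioned here are assumptions to tune, not figures from the original answer), a stream copy where the chunk size is explicit:

// Sketch: copy a stream with a configurable buffer. A tiny buffer (say
// 512 bytes) multiplies the number of read/write calls and slows the
// transfer; tens or hundreds of KB is usually a better starting point.
static void Copy(Stream source, Stream destination, int bufferSize)
{
    byte[] buffer = new byte[bufferSize];
    int bytesRead;
    while ((bytesRead = source.Read(buffer, 0, buffer.Length)) > 0)
        destination.Write(buffer, 0, bytesRead);
}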

Cudlip answered 21/2, 2012 at 10:51 Comment(0)

I was working with this for a few days, and the speed was really low, nothing compared to FileZilla. I finally solved it with multiple threads: 10 threads making connections for download gives me a good rate, and it could maybe be improved even more, with a standard FtpWebRequest configuration:

PeticionFTP.ConnectionGroupName = "MyGroupName"
PeticionFTP.ServicePoint.ConnectionLimit = 4
PeticionFTP.ServicePoint.CloseConnectionGroup("MyGroupName")

PeticionFTP.KeepAlive = False 
PeticionFTP.UsePassive = False

PeticionFTP.UseBinary = True

PeticionFTP.Credentials = New NetworkCredential(lHost.User, lHost.Password)
Encode answered 27/9, 2015 at 20:49 Comment(0)

I did some benchmarks on FtpWebRequest, similar to @Sid's response above. Setting KeepAlive to true does improve things, but the asynchronous calls did not in my test. The test consists of:

  1. FtpWebRequest to check for file existence
  2. FtpWebRequest to upload the file
  3. FtpWebRequest to rename the file on the server

  • Test FTP client: 30 files of 5 KB took 14.897 seconds
  • Test upload (alive true, connection name): 30 files of 5 KB took 34.229 seconds
  • Test async (alive true, connection name): 30 files of 5 KB took 40.659 seconds
  • Test send thread (alive true, connection name): 30 files of 5 KB took 38.926 seconds

What did improve things was an implementation of an FTP client written using the Socket class.

The benchmark is here:

https://github.com/pedro-vicente/ftp_t

Xerography answered 8/5, 2018 at 18:14 Comment(0)

Try the code below; you should get better performance:

private void Upload144_Click(object sender, EventArgs e)
{
    OpenFileDialog fileobj = new OpenFileDialog();
    fileobj.InitialDirectory = "C:\\";
    //fileobj.Filter = "Video files (*.mp4)";

    if (fileobj.ShowDialog() == DialogResult.OK)
    {
        if (fileobj.CheckFileExists)
        {
            // Record the relative path of the uploaded file in the database.
            // (Parameterized SQL would be safer here.)
            string correctfilename = System.IO.Path.GetFileName(fileobj.FileName);
            using (SqlConnection con = new SqlConnection(Properties.Settings.Default.Connection))
            {
                con.Open();
                SqlCommand cmd = new SqlCommand("Insert into Path(ID,Path5) VALUES ((select isnull(MAX(id),0) + 1 from Path),'\\Videos\\" + correctfilename + "')", con);
                cmd.ExecuteNonQuery();
            }

            // For the progress bar.
            timer5.Enabled = true;

            // Upload the file to the FTP server.
            string uploadfile = fileobj.FileName;
            string uploadFileName = new FileInfo(uploadfile).Name;
            string uploadUrl = "ftp://ftp.infotech.com";
            try
            {
                string ftpUrl = string.Format("{0}/{1}", uploadUrl, uploadFileName);
                FtpWebRequest requestObj = (FtpWebRequest)WebRequest.Create(ftpUrl);
                requestObj.Method = WebRequestMethods.Ftp.UploadFile;
                requestObj.Credentials = new NetworkCredential("[email protected]", "test@123");

                // Stream the file to the server in chunks rather than
                // loading the whole file into memory at once.
                using (FileStream fs = new FileStream(uploadfile, FileMode.Open, FileAccess.Read))
                using (Stream requestStream = requestObj.GetRequestStream())
                {
                    byte[] buffer = new byte[81920];
                    int bytesRead;
                    while ((bytesRead = fs.Read(buffer, 0, buffer.Length)) > 0)
                        requestStream.Write(buffer, 0, bytesRead);
                }

                // Complete the request; without this the transfer may never
                // be finalized on the server.
                using (FtpWebResponse response = (FtpWebResponse)requestObj.GetResponse()) { }
            }
            catch (Exception ex)
            {
                MessageBox.Show("File upload/transfer failed.\r\nError message:\r\n" + ex.Message, "Upload failed", MessageBoxButtons.OK, MessageBoxIcon.Error);
            }
        }
    }
}
Sontich answered 15/2, 2016 at 12:9 Comment(0)
