Copying Files over an Intermittent Network Connection

I am looking for a robust way to copy files over a Windows network share that is tolerant of intermittent connectivity. The application is often used on wireless, mobile workstations in large hospitals, and I'm assuming connectivity can be lost either momentarily or for several minutes at a time. The files involved are typically about 200KB - 500KB in size. The application is written in VB6 (ugh), but we frequently end up using Windows DLL calls.

Thanks!

Diley answered 20/8, 2008 at 15:4 Comment(0)

I'm unclear as to what your actual problem is, so I'll throw out a few thoughts.

  • Do you want restartable copies (with such small file sizes, that doesn't seem like it'd be that big of a deal)? If so, look at CopyFileEx with COPYFILERESTARTABLE
  • Do you want verifiable copies? Sounds like you already have that by verifying hashes.
  • Do you want better performance? It's going to be tough, as it sounds like you can't run anything on the server. Otherwise, TransmitFile may help.
  • Do you just want a fire and forget operation? I suppose shelling out to robocopy, or TeraCopy or something would work - but it seems a bit hacky to me.
  • Do you want to know when the network comes back? IsNetworkAlive has your answer.

Based on what I know so far, I think the following pseudo-code would be my approach:

sourceFile = Compress("*.*");
destFile = "X:\files.zip";

int copyFlags = COPY_FILE_FAIL_IF_EXISTS | COPY_FILE_RESTARTABLE;
while (CopyFileEx(sourceFile, destFile, null, null, false, copyFlags) == 0) {
   do {
     // optionally, increment a failed counter to break out at some point
     Sleep(1000);
   } while (!IsNetworkAlive(NETWORK_ALIVE_LAN));
}
}

Compressing the files first saves you the tracking of which files you've successfully copied, and which you need to restart. It should also make the copy go faster (smaller total file size, and larger single file size), at the expense of some CPU power on both sides. A simple batch file can decompress it on the server side.
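A minimal Python sketch of the same compress-then-retry idea, for illustration only (the function name, retry limit, and local paths standing in for the network share are all assumptions, not part of the original answer):

```python
import shutil
import time
import zipfile
from pathlib import Path

def bundle_and_copy(src_dir, dest_path, retries=30, wait_s=1.0):
    """Zip everything in src_dir into one archive, then copy that single
    file with retries, instead of tracking per-file success."""
    src_dir = Path(src_dir)
    archive = src_dir / "files.zip"
    with zipfile.ZipFile(archive, "w", zipfile.ZIP_DEFLATED) as zf:
        for f in src_dir.iterdir():
            if f != archive and f.is_file():
                zf.write(f, f.name)
    for _attempt in range(retries):
        try:
            shutil.copy2(archive, dest_path)
            return True
        except OSError:
            # share unreachable; pause and retry, as in the pseudo-code above
            time.sleep(wait_s)
    return False
```

The single-archive approach means a failed transfer only ever needs to restart one file, which is the point the answer makes about avoiding per-file bookkeeping.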

Optician answered 21/8, 2008 at 10:54 Comment(0)

I've used Robocopy for this with excellent results. By default, it will retry every 30 seconds until the file gets across.
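For reference, a typical invocation might look like this (the paths are placeholders; /Z enables restartable mode, /R sets the retry count, and /W the wait in seconds between retries, which default to 1,000,000 and 30 respectively):

```shell
REM Restartable copy that keeps retrying across network drops
robocopy C:\outbox \\server\share /Z /R:1000000 /W:30
```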

Jhansi answered 20/8, 2008 at 21:47 Comment(1)
Thanks for the RoboCopy and CopyFileEx suggestions. CopyFileEx in particular looks promising. Currently, the client application is using SHFileOperation to do the actual file copy. – Diley

Try using BITS (Background Intelligent Transfer Service). It's the infrastructure that Windows Update uses, is accessible via the Win32 API, and is built specifically to address this.

It's usually used for application updates, but should work well in any file moving situation.

http://www.codeproject.com/KB/IP/bitsman.aspx
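A quick sketch of queuing a transfer through BITS from PowerShell (paths are placeholders; BITS persists the job and resumes it automatically when connectivity returns):

```shell
# PowerShell: hand the copy to BITS so it survives network drops
Start-BitsTransfer -Source "\\server\share\report.zip" -Destination "C:\inbox\report.zip"
```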

Firooc answered 20/8, 2008 at 15:6 Comment(0)

I agree with Robocopy as a solution... that's why the utility is called "Robust File Copy".

I've used Robocopy for this with excellent results. By default, it will retry every 30 seconds until the file gets across.

And by default, a million retries. That should be plenty for your intermittent connection.

It also does restartable transfers, and you can even throttle transfers with a gap between packets (the /IPG switch), assuming you don't want to use all the bandwidth while other programs are using the same connection.

Markmarkdown answered 2/9, 2008 at 13:51 Comment(0)

How about simply sending a hash before or after you send the file, and comparing it with a hash of the file you received? That should at least make sure you have a correct file.

If you want to go all out you could do the same process, but for small parts of the file. Then when you have all pieces, join them on the receiving end.
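A minimal Python sketch of the verify-by-hash idea (the chunk size and function names are illustrative assumptions; local paths stand in for the network share):

```python
import hashlib

CHUNK_SIZE = 64 * 1024  # 64 KB pieces; small enough to re-send cheaply

def file_sha256(path):
    """Hash the whole file so both sides can confirm the copy matches."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(CHUNK_SIZE), b""):
            h.update(chunk)
    return h.hexdigest()

def copy_verified(src, dst):
    """Copy src to dst in chunks, then compare hashes end to end."""
    with open(src, "rb") as fin, open(dst, "wb") as fout:
        for chunk in iter(lambda: fin.read(CHUNK_SIZE), b""):
            fout.write(chunk)
    return file_sha256(src) == file_sha256(dst)
```

Hashing per chunk rather than per file, as the answer suggests, would let you re-send only the damaged pieces after a dropout instead of the whole file.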

Imine answered 20/8, 2008 at 15:8 Comment(1)
That's exactly what BITS does. – Firooc

You could use Microsoft SyncToy (free).

http://www.microsoft.com/Downloads/details.aspx?familyid=C26EFA36-98E0-4EE9-A7C5-98D0592D8C52&displaylang=en

Dimetric answered 17/9, 2008 at 5:10 Comment(0)

Hm, it seems rsync does it, and it doesn't need the server/daemon/install I thought it did - just $ rsync src dst.
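A hedged sketch of how that might look with retries (paths are placeholders; --partial keeps interrupted files so a re-run resumes rather than restarts, and --timeout bounds how long rsync waits on a stalled connection):

```shell
# Keep partial transfers and re-run rsync until it reports success
until rsync --partial --timeout=30 /src/dir/ /mnt/share/dir/; do
    sleep 5
done
```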

Rumormonger answered 17/9, 2008 at 21:39 Comment(0)

SMS (Systems Management Server), if it's available, works.

Expiation answered 21/8, 2008 at 21:23 Comment(0)

© 2022 - 2024 — McMap. All rights reserved.