I have an app that copies a large number of files across the network to a file server (not a web server). I am trying to display a half-decent estimate of the time remaining.
I have looked at a number of articles on SO, and while the problem is addressed, none of the solutions I have tried really do what I want. I want the estimated time remaining to be relatively stable, i.e. not jumping around all over the place as the transfer speed fluctuates.
So the first solution I looked at was to calculate the transfer speed in bytes per second:
double bytePerSec = totalBytesCopied / TimeTaken.TotalSeconds;
and then divide the total bytes remaining by the transfer rate:
double secRemain = (totalFileSizeToCopy - totalBytesCopied) / bytePerSec;
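Put together, the naive estimate looks something like this (a minimal sketch; the method name is mine, and `timeTaken` is assumed to be the elapsed `TimeSpan` since the copy started):

```csharp
using System;

class NaiveEta
{
    // Naive estimate: average speed over the whole copy so far,
    // then remaining bytes divided by that speed.
    public static double SecondsRemaining(
        long totalBytesCopied, long totalFileSizeToCopy, TimeSpan timeTaken)
    {
        double bytesPerSec = totalBytesCopied / timeTaken.TotalSeconds;
        return (totalFileSizeToCopy - totalBytesCopied) / bytesPerSec;
    }

    static void Main()
    {
        // e.g. 25 MB of 100 MB copied in 10 s -> 2.5 MB/s -> 30 s remaining
        Console.WriteLine(
            SecondsRemaining(25_000_000, 100_000_000, TimeSpan.FromSeconds(10)));
    }
}
```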
I figured that the time remaining would become more stable once a few MB had been copied (although I expected it to change gradually). It doesn't; it's erratic and jumps around all over the place.
Then I tried one of the solutions from SO:
double secRemain = (TimeTaken.TotalSeconds / totalBytesCopied) * (totalFileSizeToCopy - totalBytesCopied);
which is a similar calculation, but I hoped it might make a difference. Algebraically it is the same estimate (remaining bytes divided by the average speed so far), so unsurprisingly it behaves the same way.
So now I am thinking I need to approach this from a different angle, e.g. use averages? Use some kind of countdown timer and reset the time to go every so often? Just looking for opinions, or preferably advice from anyone who has already run into this problem.
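For illustration, the averaging idea I have in mind is something like an exponential moving average over the per-tick speed, sketched below (the class name and the 0.1 smoothing factor are my own guesses, not from any tested solution):

```csharp
using System;

class SmoothedEta
{
    double smoothedBytesPerSec;  // exponential moving average of the speed
    bool initialized;
    const double Alpha = 0.1;    // smoothing factor: smaller = steadier, slower to react

    // Call once per progress tick with the speed measured over that tick.
    public void AddSample(double bytesPerSec)
    {
        if (!initialized)
        {
            smoothedBytesPerSec = bytesPerSec;
            initialized = true;
        }
        else
        {
            smoothedBytesPerSec = Alpha * bytesPerSec + (1 - Alpha) * smoothedBytesPerSec;
        }
    }

    public double SecondsRemaining(long bytesRemaining) =>
        bytesRemaining / smoothedBytesPerSec;
}
```

The hope is that a single slow or fast tick only nudges the estimate by `Alpha` of its deviation, instead of replacing it wholesale.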