I have to transfer large files between computers over unreliable connections using WCF.
Because I want to be able to resume a transfer and I don't want to be limited in file size by WCF, I am chunking the files into 1 MB pieces. These chunks are transported as streams, which works quite nicely so far.
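For context, the streamed contract looks roughly like this (a simplified sketch; the interface name and exact signature here are illustrative, not my real contract):

    using System.IO;
    using System.ServiceModel;

    [ServiceContract]
    public interface IFileTransferService
    {
        // With TransferMode.Streamed the message body must be a single Stream
        // (or a MessageContract with one Stream body member).
        [OperationContract]
        void UploadFile(Stream chunk);
    }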
My steps are:
1. open the FileStream
2. read a chunk from the file into a byte[] and wrap it in a MemoryStream
3. transfer the chunk
4. back to step 2 until the whole file is sent
My problem is in step 2. I assume that when I create a memory stream from a byte array, the array will end up on the LOH and ultimately cause an OutOfMemoryException. I have not actually been able to reproduce this error, so maybe my assumption is wrong.
Now, I don't want to send the byte[] in the message itself, as WCF would then tell me the array size is too big. I could increase the maximum allowed array size and/or reduce the size of my chunks, but I hope there is another solution.
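For reference, these are the limits I mean, here set in code on a streamed binding (a sketch; the binding type and the numbers are just examples, not my actual configuration):

    using System.ServiceModel;

    static BasicHttpBinding CreateFileTransferBinding()
    {
        var binding = new BasicHttpBinding
        {
            // Stream the body instead of buffering the whole message.
            TransferMode = TransferMode.Streamed,
            // Must be at least one chunk plus message overhead (example value).
            MaxReceivedMessageSize = 2 * 1024 * 1024
        };
        // Only matters if the chunk were sent as a byte[] rather than a stream.
        binding.ReaderQuotas.MaxArrayLength = 1 * 1024 * 1024;
        return binding;
    }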
My actual question(s):
- Will my current solution create objects on the LOH, and will that cause me problems?
- Is there a better way to solve this?
By the way: on the receiving side I simply read smaller chunks from the arriving stream and write them directly to the file, so no large byte arrays are involved.
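Roughly like this (a sketch; the method name, the 64 KB buffer size and the FileMode are illustrative, not my exact code):

    using System.IO;

    // Receiving side: copy the arriving chunk stream into the target file
    // using a small buffer, so no large arrays are allocated.
    static void AppendChunk(Stream chunk, string targetPath)
    {
        using (var fileStream = new FileStream(targetPath, FileMode.Append, FileAccess.Write))
        {
            var buffer = new byte[64 * 1024]; // well below the LOH threshold
            int read;
            while ((read = chunk.Read(buffer, 0, buffer.Length)) > 0)
            {
                fileStream.Write(buffer, 0, read);
            }
        }
    }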
Edit:
current solution:
for (int i = resumeChunk; i < chunks; i++)
{
    // Allocate a fresh 1 MB buffer for every chunk.
    byte[] buffer = new byte[chunkSize];
    fileStream.Position = i * chunkSize;
    int actualLength = fileStream.Read(buffer, 0, (int)chunkSize);
    // Trim the last (possibly partial) chunk down to what was actually read.
    Array.Resize(ref buffer, actualLength);
    using (MemoryStream stream = new MemoryStream(buffer))
    {
        UploadFile(stream);
    }
}
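Just to illustrate the kind of alternative I'm wondering about: the same loop could reuse a single buffer and wrap only the bytes actually read, which would at least remove the extra copy made by Array.Resize (a sketch, untested):

    // Variation (sketch): one buffer allocated up front and reused; the
    // MemoryStream wraps only the bytes actually read, so Array.Resize
    // (and its copy) is no longer needed.
    byte[] buffer = new byte[chunkSize];
    for (int i = resumeChunk; i < chunks; i++)
    {
        fileStream.Position = (long)i * chunkSize;
        int actualLength = fileStream.Read(buffer, 0, buffer.Length);
        using (MemoryStream stream = new MemoryStream(buffer, 0, actualLength, writable: false))
        {
            UploadFile(stream);
        }
    }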