My program uses HttpClient to send a GET request to a Web API, which returns a file. I currently use this (simplified) code to store the file to disk:
public async Task<bool> DownloadFile()
{
    var client = new HttpClient();
    var uri = new Uri("http://somedomain.com/path");
    var response = await client.GetAsync(uri);
    if (response.IsSuccessStatusCode)
    {
        var fileName = response.Content.Headers.ContentDisposition.FileName;
        using (var fs = new FileStream(@"C:\test\" + fileName, FileMode.Create, FileAccess.Write, FileShare.None))
        {
            await response.Content.CopyToAsync(fs);
            return true;
        }
    }
    return false;
}
Now, when this code runs, the process loads the entire file into memory. I would instead expect the content to be streamed from the HttpResponseMessage.Content to the FileStream, so that only a small portion of it is held in memory at any time. We are planning to use this on large files (> 1 GB). Is there a way to achieve that without holding the whole file in memory, ideally without manually looping, i.e. reading a chunk into a byte[] and writing it to the file stream until all of the content has been written?
CopyToAsync is already doing what you describe: internally it repeatedly reads a chunk of data from the response and writes it to the file until all data is transferred, so it should not result in buffering the entire file in memory at once. – Endophyte

What about response.Content.ReadAsStreamAsync() and then Stream.CopyToAsync? – Perceptive