How to download a file to browser from Azure Blob Storage
I'm already successfully listing the available files, but I need to know how to pass a file down to the browser for the user to download, without saving it to the server.

Here is how I get the list of files:

var azureConnectionString = CloudConfigurationManager.GetSetting("AzureBackupStorageConnectString");
var containerName = ConfigurationManager.AppSettings["FmAzureBackupStorageContainer"];
if (azureConnectionString == null || containerName == null)
    return null;

CloudStorageAccount backupStorageAccount = CloudStorageAccount.Parse(azureConnectionString);
var backupBlobClient = backupStorageAccount.CreateCloudBlobClient();
var container = backupBlobClient.GetContainerReference(containerName); 
var blobs = container.ListBlobs(useFlatBlobListing: true);
var downloads = blobs.Select(blob => blob.Uri.Segments.Last()).ToList();
Sensor answered 26/5, 2015 at 19:15 Comment(0)

While blob content may be streamed through a web server, and on to the end user's browser, this approach puts load on the web server: both CPU and NIC.

An alternative approach is to provide the end user with a URI to the desired blob, which they can click in the HTML content, e.g. https://myaccount.blob.core.windows.net/mycontainer/myblob.ext.

The issue arises when the content is private: a URI like the one above only works for public blobs. For private content, you can create a Shared Access Signature (or a server-stored access policy), which results in a hashed query string appended to the URI. The new URI is then valid for a given length of time (10 minutes, for example).

Here's a small example of creating an SAS for a blob:

var sasConstraints = new SharedAccessBlobPolicy();
sasConstraints.SharedAccessStartTime = DateTime.UtcNow.AddMinutes(-5);
sasConstraints.SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(10);
sasConstraints.Permissions = SharedAccessBlobPermissions.Read;

var sasBlobToken = blob.GetSharedAccessSignature(sasConstraints);

return blob.Uri + sasBlobToken;

Note that the start time is set a few minutes in the past; this is to deal with clock drift. Here is the full tutorial from which this code sample was adapted.

By using direct blob access, you will completely bypass your VM/web role instance/web site instance (reducing server load), and have your end-user pull blob content directly from blob storage. You can still use your web app to deal with permissioning, deciding which content to deliver, etc. But... this lets you direct-link to blob resources, rather than streaming them through your web server.
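
Tying the pieces together, here is a sketch of how an MVC action might hand the browser a short-lived link instead of streaming the bytes itself. The action name and its `fileName` parameter are assumptions for illustration; the settings keys are the ones from the question:

```csharp
// Hedged sketch: redirect the browser to a SAS-signed blob URI so the
// download bypasses the web server entirely. "DownloadBlob" and "fileName"
// are hypothetical names, not from the original post.
public ActionResult DownloadBlob(string fileName)
{
    var account = CloudStorageAccount.Parse(
        CloudConfigurationManager.GetSetting("AzureBackupStorageConnectString"));
    var container = account.CreateCloudBlobClient()
        .GetContainerReference(ConfigurationManager.AppSettings["FmAzureBackupStorageContainer"]);
    var blob = container.GetBlockBlobReference(fileName);

    var sasToken = blob.GetSharedAccessSignature(new SharedAccessBlobPolicy
    {
        SharedAccessStartTime = DateTime.UtcNow.AddMinutes(-5),  // allow for clock drift
        SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(10),
        Permissions = SharedAccessBlobPermissions.Read
    });

    // The browser fetches the blob straight from storage; the web server
    // only issues this redirect.
    return Redirect(blob.Uri + sasToken);
}
```

Your app still decides who may call this action (authentication, authorization), but the bytes themselves never touch your server.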

Kinin answered 26/5, 2015 at 20:22 Comment(6)
More about the Valet Key pattern here: msdn.microsoft.com/en-us/library/dn568102.aspx Tundra
I had to adopt a similar approach after getting an error when downloading a file with just its URI. Thanks for the heads up, David.Tilden
I appreciate this is a couple of years old now, but I'm coming up against this issue when downloading multiple files and zipping them. I provide SAS URLs for single-file downloads, which works great, but when a user wants to download, say, 100 image files, I don't want to hand the browser 100 SAS URLs; I want to consolidate them into a zip file. I want to do this without completely freezing my server, but have yet to find a front-end framework that supports zipping blob files using the client's resources. Any guidance will be greatly appreciatedTelescopium
How can we modify the above snippet if we need to preview the file in the browser without downloading it to the user's local storage?Glasscock
I set my blob container access level to 'public read access for containers and blobs'. However, providing the end user with an HTML5 download link to the blob from my website returned a 401 unauthorized error. BTW, I'm using the Azure.Storage.Blobs v12 NuGet package. Maybe the SAS approach will work, but providing the URL to the blob did not work even with the public access level on the blob and container.Photodrama
@Photodrama - please post a new question, with all relevant details, vs posting a new question in a comment to another answer (the question really isn't directly related to the question I answered). Also, my answer is from 7 years ago, with a completely different version of the SDK.Kinin

Once the user clicks a file, the server responds with this:

var blob = container.GetBlobReferenceFromServer(option);

var memStream = new MemoryStream();
blob.DownloadToStream(memStream);

Response.ContentType = blob.Properties.ContentType;
Response.AddHeader("Content-Disposition", "Attachment;filename=" + option);
Response.AddHeader("Content-Length", blob.Properties.Length.ToString());
Response.BinaryWrite(memStream.ToArray());

HUGE thanks to Dhananjay Kumar for this solution.
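
A note on memory: the snippet above buffers the entire blob in a MemoryStream before the first byte reaches the browser. A lower-overhead variant (a sketch, assuming classic ASP.NET where `Response` is a `System.Web.HttpResponse`) streams the blob directly into the response:

```csharp
Response.ContentType = blob.Properties.ContentType;
Response.AddHeader("Content-Disposition", "Attachment;filename=" + option);
Response.AddHeader("Content-Length", blob.Properties.Length.ToString());
Response.BufferOutput = false;                  // flush bytes to the client as they arrive
blob.DownloadToStream(Response.OutputStream);   // no intermediate MemoryStream
```

The blob still routes through your server, but memory usage stays flat and the browser starts receiving data immediately.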

Sensor answered 26/5, 2015 at 19:15 Comment(4)
So, by doing this, you do realize that the entire contents of the blob will route through your server, right? That is, the contents of the blob will travel from blob storage to your VM/website/web role instance, then through to your end-user, via IIS / OWIN / etc.?Kinin
what would you recommend? I can't give my end users access to the entire storage, so an Azure storage explorer wouldn't work.Sensor
I posted an alternative answer.Kinin
I found this was slow, as the first byte won't go to the browser until the last byte has come from blob storage. Use Andy's two-line answer instead for less memory overhead and 10ms latency!Inequality

If you use ASP.NET (Core), you can stream the content to the browser without saving the file on the server; returning a FileStreamResult (which is an IActionResult) is a more elegant solution.

var stream = await blob.OpenReadAsync();
return File(stream, blob.Properties.ContentType, option);
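
For context, a sketch of a complete ASP.NET Core action built around those two lines. The action name, the `option` parameter, and how `container` is resolved are assumptions carried over from the question's code, not part of the original answer:

```csharp
// Hedged sketch of a full controller action using FileStreamResult.
public async Task<IActionResult> Download(string option)
{
    var blob = container.GetBlockBlobReference(option);
    await blob.FetchAttributesAsync();   // populate Properties (ContentType, Length)

    var stream = await blob.OpenReadAsync();
    var result = File(stream, blob.Properties.ContentType, option);
    result.EnableRangeProcessing = true; // honor Range headers: seeking/resume works
    return result;
}
```

`EnableRangeProcessing` (mentioned in the comments below) lets the browser request byte ranges, so video scrubbing and resumed downloads work without pulling the whole blob.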
Mccollough answered 20/12, 2018 at 16:36 Comment(6)
This is the best solution if you don't want to risk your SAS tokens getting stolen, they are just in the query string after all so are not encrypted even with https. Also if you are streaming secure videos set EnableRangeProcessing on the File object before returning it and it will let the browser skip through the video without downloading the whole file!Inequality
@DanielBailey I disagree; this makes the whole blob be unnecessarily downloaded to your application, consuming your server's bandwidth and making the process slower. You can set an expiry date on SAS tokens as in David's answer, and you can't just steal a token and read anything: the token appended to the URI is for that blob only, which means anyone with that URI could do exactly the same with a URI from an application executing Andy's code above.Dene
It is a bit magical but the whole blob is not downloaded, only the range, it is like the blob storage knows you only want a part of the file and streams only that part to the host, then the host forwards it on with only a tiny buffer.Inequality
You would need a pretty short expiry on the SAS tokens to prevent anyone doing basic logging of what goes through an access point from immediately downloading whatever it is, and you can't have it too short, as the browser needs time to do the round trip and make the request back out directly to blob storage. At least if you do a POST under HTTPS they will need to decrypt the message to get at the keys needed to reach the blob; much safer I think, but I am no expert on SAS token protection. There must be a safe way of doing it; a GET is not it, though.Inequality
@Alisson you are forgetting about the scenario in which you need to authorize downloads. A SAS token doesn't give you this. Someone logged into your application can give away those links with the SAS token, and everyone will be able to download the file. When it goes through your endpoint, unauthenticated/unauthorized users will receive 401/403.Bushbuck
I'm using .NET 6 (Core) and this solution works, whereas providing a link to the blob with public access level gives me a 401 unauthorized error (even though the blob and container have public access level). BTW, I am using the Azure.Storage.Blobs v12 NuGet package. Also, I haven't tried the SAS method yet.Photodrama

Here is a sample that can both upload and download blob files.

using System;
using System.Threading.Tasks;
using System.IO;
using Microsoft.Azure.Storage;
using Microsoft.Azure.Storage.Blob;
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;
using System.Linq;
using System.Collections.Generic;

namespace GetBackup
{
    class Program
    {
        static async Task Main(string[] args)
        {
            string Config_string = "";

            using (StreamReader SourceReader = File.OpenText(@"appsettings.json"))
            {
                Config_string = await SourceReader.ReadToEndAsync();
            }

            var config = (JObject)JsonConvert.DeserializeObject(Config_string);

            if(config["Application_type"].ToString()== "Backup")
            {
                string Dir_path = config["Backup_Path"].ToString();
                string[] allfiles = Directory.GetFiles(Dir_path, "*.*", SearchOption.AllDirectories);


                string storageConnectionString = config["AZURE_STORAGE_CONNECTION_STRING"].ToString();
                CloudStorageAccount storageAccount;
                if (CloudStorageAccount.TryParse(storageConnectionString, out storageAccount))
                {
                    CloudBlobClient cloudBlobClient = storageAccount.CreateCloudBlobClient();
                    CloudBlobContainer cloudBlobContainer = cloudBlobClient.GetContainerReference("rtddata");
                    // container "rtddata" is assumed to exist; otherwise: await cloudBlobContainer.CreateIfNotExistsAsync();

                    string[] ExcludeFiles = config["Exception_File"].ToString().Split(',');

                    foreach (var file in allfiles)
                    {
                        FileInfo info = new FileInfo(file);
                        if (!ExcludeFiles.Contains(info.Name))
                        {
                            string folder = (Dir_path.Length < info.DirectoryName.Length) ? info.DirectoryName.Replace(Dir_path, "") : "";
                            folder = (folder.Length > 0) ? folder + "/" : "";
                            CloudBlockBlob cloudBlockBlob = cloudBlobContainer.GetBlockBlobReference(folder + info.Name);
                            await cloudBlockBlob.UploadFromFileAsync(info.FullName);
                        }

                    }

                }
            }
            
            else if (config["Application_type"].ToString() == "Restore")
            {
                string storageConnectionString = config["AZURE_STORAGE_CONNECTION_STRING"].ToString();
                CloudStorageAccount storageAccount;
               
                if (CloudStorageAccount.TryParse(storageConnectionString, out storageAccount))
                {
                    CloudBlobClient cloudBlobClient = storageAccount.CreateCloudBlobClient();
                    CloudBlobContainer cloudBlobContainer = cloudBlobClient.GetContainerReference("rtddata");
                    string Dir_path = config["Restore_Path"].ToString();

                    IEnumerable<IListBlobItem> results = cloudBlobContainer.ListBlobs(null, true);
                    foreach (IListBlobItem item in results)
                    {
                        string name = ((CloudBlockBlob)item).Name;
                        string path = Path.Combine(Dir_path, name);

                        // Create the full directory tree for nested blob "folders",
                        // not just the first level
                        Directory.CreateDirectory(Path.GetDirectoryName(path));

                        CloudBlockBlob blockBlob = cloudBlobContainer.GetBlockBlobReference(name);
                        await blockBlob.DownloadToFileAsync(path, FileMode.Create);
                    }
                }
            }
        }
    }
}
Load answered 22/8, 2020 at 14:1 Comment(1)
Hi, I'm using almost the same "Restore" code to download a file, and it works perfectly. The problem is that the file I download is used as a temp file, so I need to delete it, but with this code I get an error due to disposal. Maybe you could help me with this issue?Wasteland