How to track progress of an async file upload to Azure Storage

Is there a way to track the progress of a file upload to an Azure storage container? I am trying to make a console application for uploading data to Azure using C#.

My current code looks like this:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Auth;
using Microsoft.WindowsAzure.Storage.Blob;
using System.Configuration;
using System.IO;
using System.Threading;

namespace AdoAzure
{
    class Program
    {
        static void Main(string[] args)
        {
            CloudStorageAccount storageAccount = CloudStorageAccount.Parse(
            ConfigurationManager.ConnectionStrings["StorageConnectionString"].ConnectionString);
            CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
            CloudBlobContainer container = blobClient.GetContainerReference("adokontajnerneki");
            container.CreateIfNotExists();
            CloudBlobClient myBlobClient = storageAccount.CreateCloudBlobClient();
            CloudBlockBlob myBlob = container.GetBlockBlobReference("racuni.adt");
            CancellationToken ca = new CancellationToken();
            var ado = myBlob.UploadFromFileAsync(@"c:\bo\racuni.adt", FileMode.Open, ca);
            Console.WriteLine(ado.Status); //Does Not Help Much
            ado.ContinueWith(t =>
            {
                Console.WriteLine("It is over"); //this is working OK
            });
            Console.WriteLine(ado.Status); //Does Not Help Much
            Console.WriteLine("theEnd");
            Console.ReadKey();
        }
    }
}

This piece of code works well, but I'd love to have some kind of progress bar so users can see that there are tasks running. Is there something built into the WindowsAzure.Storage.Blob namespace that I can use, like a rabbit from a hat?

Brazzaville answered 16/1, 2014 at 23:48 Comment(0)

I don't think it's possible, because uploading a file is a single task: even though the file is internally split into multiple chunks and those chunks are uploaded, the code actually waits for the entire task to finish.

One possibility would be to manually split the file into chunks and upload those chunks asynchronously using the PutBlockAsync method. Once all chunks are uploaded, you can then call the PutBlockListAsync method to commit the blob. Please see the code below, which accomplishes that:

using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Auth;
using Microsoft.WindowsAzure.Storage.Blob;
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Text;
using System.Threading;
using System.Threading.Tasks;

namespace ConsoleApplication1
{
    class Program
    {
        static CloudStorageAccount storageAccount = new CloudStorageAccount(new StorageCredentials("accountname", "accountkey"), true);
        static void Main(string[] args)
        {
            CloudBlobClient myBlobClient = storageAccount.CreateCloudBlobClient();
            myBlobClient.SingleBlobUploadThresholdInBytes = 1024 * 1024;
            CloudBlobContainer container = myBlobClient.GetContainerReference("adokontajnerneki");
            //container.CreateIfNotExists();
            CloudBlockBlob myBlob = container.GetBlockBlobReference("cfx.zip");
            var blockSize = 256 * 1024;
            myBlob.StreamWriteSizeInBytes = blockSize;
            var fileName = @"D:\cfx.zip";
            long bytesToUpload = (new FileInfo(fileName)).Length;
            long fileSize = bytesToUpload;

            if (bytesToUpload < blockSize)
            {
                CancellationToken ca = new CancellationToken();
                var ado = myBlob.UploadFromFileAsync(fileName, FileMode.Open, ca);
                Console.WriteLine(ado.Status); //Does Not Help Much
                ado.ContinueWith(t =>
                {
                    Console.WriteLine("Status = " + t.Status);
                    Console.WriteLine("It is over"); //this is working OK
                });
            }
            else
            {
                List<string> blockIds = new List<string>();
                int index = 1;
                long startPosition = 0;
                long bytesUploaded = 0;
                do
                {
                    var bytesToRead = Math.Min(blockSize, bytesToUpload);
                    var blobContents = new byte[bytesToRead];
                    using (FileStream fs = new FileStream(fileName, FileMode.Open))
                    {
                        fs.Position = startPosition;
                        fs.Read(blobContents, 0, (int)bytesToRead);
                    }
                    ManualResetEvent mre = new ManualResetEvent(false);
                    var blockId = Convert.ToBase64String(Encoding.UTF8.GetBytes(index.ToString("d6")));
                    Console.WriteLine("Now uploading block # " + index.ToString("d6"));
                    blockIds.Add(blockId);
                    var ado = myBlob.PutBlockAsync(blockId, new MemoryStream(blobContents), null);
                    ado.ContinueWith(t =>
                    {
                        bytesUploaded += bytesToRead;
                        bytesToUpload -= bytesToRead;
                        startPosition += bytesToRead;
                        index++;
                        double percentComplete = (double)bytesUploaded / (double)fileSize;
                        Console.WriteLine("Percent complete = " + percentComplete.ToString("P"));
                        mre.Set();
                    });
                    mre.WaitOne();
                }
                while (bytesToUpload > 0);
                Console.WriteLine("Now committing block list");
                var pbl = myBlob.PutBlockListAsync(blockIds);
                pbl.ContinueWith(t =>
                {
                    Console.WriteLine("Blob uploaded completely.");
                });
            }
            Console.ReadKey();
        }
    }
}
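
As noted in the comments, the listing above reopens the FileStream on every iteration. Here is a sketch of the same do/while loop rewritten with the stream opened once; it reuses the variables declared above (blockSize, bytesToUpload, bytesUploaded, fileSize, blockIds, index, myBlob, fileName) and simply waits on each PutBlockAsync call to keep the example short:

using (FileStream fs = new FileStream(fileName, FileMode.Open, FileAccess.Read))
{
    do
    {
        var bytesToRead = (int)Math.Min(blockSize, bytesToUpload);
        var blobContents = new byte[bytesToRead];
        int read = fs.Read(blobContents, 0, bytesToRead); //sequential reads, so no need to reset Position

        var blockId = Convert.ToBase64String(Encoding.UTF8.GetBytes(index.ToString("d6")));
        blockIds.Add(blockId);
        myBlob.PutBlockAsync(blockId, new MemoryStream(blobContents, 0, read), null).Wait();

        bytesUploaded += read;
        bytesToUpload -= read;
        index++;
        Console.WriteLine("Percent complete = " + ((double)bytesUploaded / fileSize).ToString("P"));
    }
    while (bytesToUpload > 0);
}
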
Hellenist answered 17/1, 2014 at 9:48 Comment(2)
Wizard!!! Thank you so much. After weeks of research this helped tremendously in solving the final piece to my problem.Misapprehend
Just noting here that you shouldn't open a file stream repeatedly inside the loop like that - the FileStream should be opened once, outside the loop.Ganger

Gaurav's solution works well and is very similar to http://blogs.msdn.com/b/kwill/archive/2011/05/30/asynchronous-parallel-block-blob-transfers-with-progress-change-notification.aspx. The challenge with this code is that you are doing a lot of complex work with very little error handling. I am not saying there is anything wrong with Gaurav's code - it looks solid - but especially with network-related communication code there are lots of variables and lots of issues that you have to account for.

For this reason I modified my original blog post to use the upload code from the storage client library (on the assumption that code coming from the Azure Storage team is more robust than anything I could write) and to track progress using a ProgressStream class. You can see the updated code at http://blogs.msdn.com/b/kwill/archive/2013/03/06/asynchronous-parallel-block-blob-transfers-with-progress-change-notification-2-0.aspx.
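
Since the linked post may move or disappear, here is a minimal sketch of the ProgressStream idea: a Stream wrapper that invokes a callback as bytes pass through it, which you can then hand to UploadFromStreamAsync. The class and member names below are illustrative (not copied from the post), and the sketch assumes the same older Microsoft.WindowsAzure.Storage client used elsewhere in this thread, with its default single-threaded upload:

using System;
using System.IO;
using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage.Blob;

//Sketch only: wraps another stream and reports a running byte count to a callback.
//Not thread-safe, so it assumes the default ParallelOperationThreadCount of 1.
public class ProgressStream : Stream
{
    private readonly Stream _inner;
    private readonly Action<long> _onProgress;
    private long _totalBytes;

    public ProgressStream(Stream inner, Action<long> onProgress)
    {
        _inner = inner;
        _onProgress = onProgress;
    }

    public override int Read(byte[] buffer, int offset, int count)
    {
        int read = _inner.Read(buffer, offset, count); //the storage client reads from here while uploading
        _totalBytes += read;
        _onProgress(_totalBytes);
        return read;
    }

    public override void Write(byte[] buffer, int offset, int count)
    {
        _inner.Write(buffer, offset, count);
        _totalBytes += count;
        _onProgress(_totalBytes);
    }

    public override bool CanRead { get { return _inner.CanRead; } }
    public override bool CanSeek { get { return _inner.CanSeek; } }
    public override bool CanWrite { get { return _inner.CanWrite; } }
    public override long Length { get { return _inner.Length; } }
    public override long Position { get { return _inner.Position; } set { _inner.Position = value; } }
    public override void Flush() { _inner.Flush(); }
    public override long Seek(long offset, SeekOrigin origin) { return _inner.Seek(offset, origin); }
    public override void SetLength(long value) { _inner.SetLength(value); }
}

//Example usage:
public static async Task UploadWithProgressAsync(CloudBlockBlob blob, string fileName)
{
    using (var fileStream = File.OpenRead(fileName))
    using (var progressStream = new ProgressStream(fileStream,
        total => Console.WriteLine("Uploaded {0:P}", (double)total / fileStream.Length)))
    {
        await blob.UploadFromStreamAsync(progressStream);
    }
}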

Gelatinate answered 17/1, 2014 at 17:31 Comment(3)
+1 @kwill. I agree that we have to deal with situations where the upload process fails, which I have not taken into account in my code above. Also, my code uploads one block at a time; in production one may want to upload multiple blocks in parallel.Hellenist
You should put examples from that page here, not just a link. @GauravMantri's answer is short and nice. That link is hideous.Wooster
Please note that the CloudBlobClient.SingleBlobUploadThresholdInBytes property is now obsolete. Use DefaultRequestOptions.SingleBlobUploadThresholdInBytes instead.Shofar

How about this:

public class ObservableFileStream : FileStream
{
    private Action<long> _callback;

    public ObservableFileStream(String fileName, FileMode mode, Action<long> callback) : base(fileName, mode)
    {
        _callback = callback;
    }

    // When uploading, the storage client pulls data through Read, so that override drives the progress callback;
    // Write reports progress as well in case the stream is written to (e.g. during a download).
    public override void Write(byte[] array, int offset, int count)
    {
        _callback?.Invoke(Length);
        base.Write(array, offset, count);
    }

    public override int Read(byte[] array, int offset, int count)
    {
        _callback?.Invoke(Position);
        return base.Read(array, offset, count);
    }
}
public class Test
{
    private async Task Upload(String filePath, CloudBlockBlob blob)
    {
        ObservableFileStream fs = null;

        using (fs = new ObservableFileStream(filePath, FileMode.Open, (current) =>
        {
            Console.WriteLine("Uploading " + ((double)current / (double)fs.Length) * 100d);
        }))
        {
            await blob.UploadFromStreamAsync(fs);
        }
    }
}
Histolysis answered 14/11, 2018 at 10:0 Comment(0)

The code below uses the Azure Blob Storage SDK v12. Reference: https://www.craftedforeveryone.com/upload-or-download-file-from-azure-blob-storage-with-progress-percentage-csharp/

private string connectionString = "<Your storage connection string>";
private string containerName = "<Your storage container name>";
private long uploadFileSize; //Set when the upload starts; used to calculate the progress percentage

public void UploadBlob(string fileToUploadPath)
{
    var file = new FileInfo(fileToUploadPath);
    uploadFileSize = file.Length; //Get the file size. This is needed to calculate the upload progress percentage

    //Initialize a progress handler. While the file is being uploaded, the Blob Storage client reports the number of bytes uploaded so far through this handler
    var progressHandler = new Progress<long>();
    progressHandler.ProgressChanged += UploadProgressChanged;

    var blob = new BlobClient(connectionString, containerName, file.Name); //Initialize the blob client
    blob.Upload(fileToUploadPath, progressHandler: progressHandler); //Make sure to pass the progress handler here
}

private void UploadProgressChanged(object sender, long bytesUploaded)
{
    //Calculate the progress and update the progress bar.
    //Note: the reported value is a long; it is converted to double in the call below so the percentage is not truncated by integer division
    Console.WriteLine(GetProgressPercentage(uploadFileSize, bytesUploaded));
}

private double GetProgressPercentage(double totalSize, double currentSize)
{
    return (currentSize / totalSize) * 100;
}
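
For reference, the newer async pattern in the v12 SDK passes the same kind of handler through BlobUploadOptions.ProgressHandler instead of the optional progressHandler parameter. A minimal sketch, assuming the same connection string and container name placeholders as above (the method name is mine, not from the linked article):

//Requires: using System; using System.IO; using System.Threading.Tasks; using Azure.Storage.Blobs; using Azure.Storage.Blobs.Models;
public async Task UploadBlobAsync(string fileToUploadPath)
{
    var fileSize = new FileInfo(fileToUploadPath).Length;
    var blob = new BlobClient(connectionString, containerName, Path.GetFileName(fileToUploadPath));

    var options = new BlobUploadOptions
    {
        //The SDK reports the cumulative number of bytes uploaded through this handler
        ProgressHandler = new Progress<long>(bytesUploaded =>
            Console.WriteLine("{0:F2} %", (double)bytesUploaded / fileSize * 100))
    };

    await blob.UploadAsync(fileToUploadPath, options);
}
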
Harri answered 18/9, 2020 at 11:22 Comment(1)
how would you send the progress to the front end? Console.WriteLine won't work...Numismatist

Use BlockBlobClient instead of BlobClient (using Azure.Storage.Blobs.Specialized;)

private const string StorageConnectionString = "<Your storage connection string>";
private const string StorageContainerName = "<Your storage container name>";

[HttpPost]
public async Task<IActionResult> UploadFiles(IList<IFormFile> files)
{
    foreach (var file in files)
    {
        var blockBlobClient = new BlockBlobClient(StorageConnectionString, StorageContainerName, file.FileName); // Add Guid to FileName to ensure uniqueness

        using var output = blockBlobClient.OpenWrite(overwrite: true);
        using var input = file.OpenReadStream();

        var buffer = new byte[64 * 1024];
        var totalReadBytes = 0L;
        var readBytes = 0;

        while ((readBytes = input.Read(buffer, 0, buffer.Length)) > 0)
        {
            await output.WriteAsync(buffer, 0, readBytes);

            totalReadBytes += readBytes;

            //progress = Math.Round((totalReadBytes * 1d / sessionUploadFileInfo.FileLength * 100d), 2, MidpointRounding.AwayFromZero);
            //do what you do with the progress, save it in session or however you display it to the user...
            //With WebAPI, I use a static dictionary and a client Id to store file upload progress
            //A separate REST request with client Id gets the file progress' from the dictionary
        }
    }
    return Content("success");
}
Flin answered 8/3, 2021 at 10:46 Comment(0)
