Check space used in azure storage accounts in my subscription
B

10

32

How can I check how much space is used in each of the Azure storage accounts in my subscription, broken down by resource group?

I am not able to find a way to check the space used in an Azure storage account through PowerShell, the CLI, or the portal...

Blondie answered 26/4, 2017 at 9:1 Comment(1)
Looking at billing will show you daily charges, which is ultimately what really matters. It may reveal other surprises and you can sort all your resources by cost. Also note that the newer version 2 pricing is (mostly) cheaper than version 1 so check you're on the latest version.Chronological
V
22

Azure Storage size consists of all four services (Blob, Queue, File, Table) together. To my knowledge, there is currently no single built-in way to calculate the total size across all services.

However, you can get the blob space used in the portal via Azure metrics: select Monitor --> Metrics.


For more information about monitoring a storage account in the Azure portal, refer to this link.

Also, you could use PowerShell to get your blob usage. There is a good script you could use.
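For illustration, here is a minimal sketch of that approach with the current Az modules. It assumes Az.Storage is installed, you are already signed in with Connect-AzAccount, and the resource group and account names ("my-rg", "mystorageaccount") are placeholders to replace:

# Minimal sketch: sum the blob sizes in every container of one storage account.
# Assumes an existing Connect-AzAccount session; "my-rg" and "mystorageaccount" are placeholders.
$storageAccount = Get-AzStorageAccount -ResourceGroupName "my-rg" -Name "mystorageaccount"
$ctx = $storageAccount.Context

$totalBytes = 0
foreach ($container in Get-AzStorageContainer -Context $ctx) {
    # Sum the Length property of every blob in this container
    $size = (Get-AzStorageBlob -Container $container.Name -Context $ctx |
             Measure-Object -Property Length -Sum).Sum
    Write-Output ("{0}: {1:N0} bytes" -f $container.Name, $size)
    $totalBytes += $size
}
Write-Output ("Total blob usage: {0:N2} GB" -f ($totalBytes / 1GB))

Listing every blob can take a long time on large accounts, so treat this as a rough check rather than a billing figure.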

Vernon answered 26/4, 2017 at 9:24 Comment(3)
Your answer will only provide information about the blob storage size. It will not include the space consumed by tables, files & queues in that storage account.Vorfeld
The same section provides capacity information on the other storage types.Perennate
You can do all of this at the level of the storage account itself.Chronological
K
14

After a lot of searching, I found this very relevant article:

https://techcommunity.microsoft.com/t5/azure-paas-blog/calculate-the-size-capacity-of-storage-account-and-it-services/ba-p/1064046

Use Azure Monitor to check the capacity of the storage accounts. Steps:

  • Navigate to Azure Monitor
  • Click on Storage Accounts
  • Click on Capacity. You can see all the accounts and their used capacity side by side here. (A PowerShell sketch that queries the same metric follows below.)

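If you prefer to read the same UsedCapacity metric from PowerShell instead of the portal, a rough sketch with the Az.Monitor module (resource group and account names are placeholders) could be:

# Minimal sketch: query the UsedCapacity metric the portal chart is based on.
# Assumes an existing Connect-AzAccount session; "my-rg" and "mystorageaccount" are placeholders.
$account = Get-AzStorageAccount -ResourceGroupName "my-rg" -Name "mystorageaccount"
$metric  = Get-AzMetric -ResourceId $account.Id -MetricName "UsedCapacity" -AggregationType Average

# UsedCapacity is reported in bytes; take the most recent data point
$latest = $metric.Data | Select-Object -Last 1
"{0:N2} GB used" -f ($latest.Average / 1GB)

UsedCapacity covers blobs, files, queues and tables together, which is the same number the Capacity view in Azure Monitor shows.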

Keavy answered 12/3, 2021 at 18:47 Comment(0)
C
12

Azure Storage Explorer has a 'Directory Statistics' button.

  • Navigate to a folder
  • Click the 'Directory Statistics' button
  • The total is shown in the Activities panel

Chronological answered 22/8, 2018 at 23:54 Comment(3)
For blobs it's called 'Folder Statistics'. This example is for File Shares.Chronological
If I use this method (Azure Storage Explorer) it shows about 2 TiB, but when using Monitor in the Azure portal (https://mcmap.net/q/448368/-check-space-used-in-azure-storage-accounts-in-my-subscription) it shows about 6 TiB in the last 4 hours. Why this difference? I guess the latter amount is what I am charged for.Usm
If you see 'in the last 4 hours' that implies to me maybe it's bandwidth you're looking at and not size on disk? Another possibility - and this is a complete guess, but maybe if you have many small files there is a minimum block size that each file can be on disk and that larger size would be what you're charged for.Chronological
H
5

Here is a .NET Core program I use to list storage account usage, using the average metric value over the last hour.

using System;
using System.Collections.Generic;
using System.IO;
using System.Threading;
using Microsoft.Azure.Management.CosmosDB.Fluent.Models;
using Microsoft.Azure.Management.Fluent;
using Microsoft.Azure.Management.Monitor;
using Microsoft.Azure.Management.Monitor.Models;
using Microsoft.Rest.Azure.Authentication;

namespace storagelist
{
    class Program
    {
        static async System.Threading.Tasks.Task Main(string[] args)
        {
            // to generate my.azureauth file run the follow command:
            // az ad sp create-for-rbac --sdk-auth > my.azureauth
            var azure = Azure.Authenticate("my.azureauth").WithDefaultSubscription();

            var accounts = azure.StorageAccounts.List();
            // can get values from my.azureauth
            var tenantId = "";
            var clientId = "";
            var clientSecret = "";
            var serviceCreds = await ApplicationTokenProvider.LoginSilentAsync(tenantId, clientId, clientSecret);
            MonitorManagementClient readOnlyClient = new MonitorManagementClient(serviceCreds);

            var oneHour = System.TimeSpan.FromHours(1);
            var startDate = DateTime.Now.AddHours(-oneHour.Hours).ToUniversalTime().ToString("o");
            string endDate = DateTime.Now.ToUniversalTime().ToString("o");
            string timeSpan = startDate + "/" + endDate;

            List<string> fileContents = new List<string>();

            foreach (var storage in accounts)
            {
                var response = await readOnlyClient.Metrics.ListAsync(
                resourceUri: storage.Id,
                timespan: timeSpan,
                interval: oneHour,
                metricnames: "UsedCapacity",
                aggregation: "Average",
                resultType: ResultType.Data,
                cancellationToken: CancellationToken.None);

                foreach (var metric in response.Value)
                {
                    foreach (var series in metric.Timeseries)
                    {
                        foreach (var point in series.Data)
                        {
                            if (point.Average.HasValue)
                            {
                                fileContents.Add($"{storage.Id}, {point.Average.Value}");
                                break;
                            }
                        }
                        break;
                    }
                    break;
                }
            }

            await File.WriteAllLinesAsync("./storage.csv", fileContents);
        }
    }
}
Hone answered 30/1, 2019 at 14:9 Comment(1)
Amazing, I didn't know I wanted this until I saw it. Thanks, I'll be using this.Capacious
N
3

This will give the storage account capacity for each resource group in all subscriptions:

# Loop over every subscription, resource group and storage account,
# query the UsedCapacity metric and append the result to a CSV file.
$sub = Get-AzSubscription | Select-Object Name
$sub | ForEach-Object {
    Set-AzContext -Subscription $_.Name
    $currentSub = $_.Name
    $RGs = Get-AzResourceGroup | Select-Object ResourceGroupName
    $RGs | ForEach-Object {
        $CurrentRG = $_.ResourceGroupName
        $StorageAccounts = Get-AzStorageAccount -ResourceGroupName $CurrentRG | Select-Object StorageAccountName
        $StorageAccounts | ForEach-Object {
            $StorageAccount = $_.StorageAccountName
            $CurrentSAID = (Get-AzStorageAccount -ResourceGroupName $CurrentRG -AccountName $StorageAccount).Id
            # UsedCapacity is reported in bytes; convert to MB
            $usedCapacity = (Get-AzMetric -ResourceId $CurrentSAID -MetricName "UsedCapacity").Data
            $usedCapacityInMB = $usedCapacity.Average / 1024 / 1024
            "$StorageAccount,$usedCapacityInMB,$CurrentRG,$currentSub" >> ".\storageAccountsUsedCapacity.csv"
        }
    }
}


Nuss answered 8/1, 2020 at 15:1 Comment(0)
E
2

You can go to: Home > {storage account} > {container} > Properties. Under Properties you will find a 'Calculate size' option that shows the container size.

Eurhythmic answered 30/3, 2021 at 12:41 Comment(0)
W
1

I have created a Python script to calculate the used storage across all subscriptions. It's not quick, because the script has to:

  • request all subscriptions visible to the provided credentials
  • query Azure Resource Graph to receive the list of /subscription/resourcegroup/storageaccount resource IDs
  • generate the list of subscriptions where a storage account exists
  • query Azure Monitor for every /subscription/resourcegroup/storageaccount to receive its UsedCapacity

from azure.mgmt.monitor import MonitorManagementClient
from azure.mgmt.subscription import SubscriptionClient
from msrestazure.azure_active_directory import ServicePrincipalCredentials
from azure.mgmt.resourcegraph import ResourceGraphClient
from azure.mgmt.resourcegraph.models import QueryRequest

# client_id, secret and tenant_id must be set to your service principal values
credentials = ServicePrincipalCredentials(client_id, secret, tenant=tenant_id)

sub_object = SubscriptionClient(credentials)
rgraph_object = ResourceGraphClient(credentials)
storageaccount_pattern = "resources | where type == 'microsoft.storage/storageaccounts' | project id"

# List every subscription the service principal can see
subs = [sub.as_dict() for sub in sub_object.subscriptions.list()]

subs_list = []
for sub in subs:
    subs_list.append(sub.get('subscription_id'))

# Ask Azure Resource Graph for all storage account resource IDs
request_storageaccount = QueryRequest(subscriptions=subs_list, query=storageaccount_pattern)
rgraph_storageaccount = rgraph_object.resources(request_storageaccount).as_dict()

resource_ids = []

for element in rgraph_storageaccount['data']:
    resource_ids.append(element['id'])

# Query Azure Monitor for each storage account and sum the UsedCapacity averages
count_used_storage = 0
for resource_id in resource_ids:
    sub = (resource_id.split('/'))[2]
    monitor_object = MonitorManagementClient(credentials, subscription_id=sub)
    metrics_data = monitor_object.metrics.list(resource_id, metricnames='UsedCapacity')

    for item in metrics_data.value:
        for timeserie in item.timeseries:
            for data in timeserie.data:
                try:
                    count_used_storage = count_used_storage + data.average
                except:
                    # skip data points that have no average value
                    pass

print(count_used_storage)

For ~400 subscriptions and ~1100 storage accounts the script runs in about 600 seconds.

For one subscription it's much faster :)

Wahkuna answered 12/4, 2020 at 17:59 Comment(0)
K
1

Using Cloud Shell is one of the best solutions so far:

  • Add a PowerShell file in Cloud Shell (check out the code below)
  • Run the PS script with the storage account and resource group names


Code

param($resourceGroup, $storageAccountName)

# usage
# .\Get-StorageAccountSize.ps1 -resourceGroup <resource-group> -storageAccountName <storage-account-name>

# Connect to Azure
Connect-AzureRmAccount

# Get a reference to the storage account and the context
$storageAccount = Get-AzureRmStorageAccount `
    -ResourceGroupName $resourceGroup `
    -Name $storageAccountName
$ctx = $storageAccount.Context

# Get all blob containers
$AllContainers = Get-AzureStorageContainer -Context $ctx
$AllContainersCount = $AllContainers.Count
Write-Host "We found '$($AllContainersCount)' containers. Processing size for each one"

# Zero counters
$TotalLength = 0
$TotalContainers = 0

# Loop over each container and calculate its size
Foreach ($Container in $AllContainers) {
    $TotalContainers = $TotalContainers + 1
    Write-Host "Processing Container '$($TotalContainers)'/'$($AllContainersCount)'"
    $listOfBlobs = Get-AzureStorageBlob -Container $Container.Name -Context $ctx

    # zero out our total
    $length = 0

    # loop through the list of blobs, retrieve the length of each blob and add it to the total
    $listOfBlobs | ForEach-Object { $length = $length + $_.Length }
    $TotalLength = $TotalLength + $length
}
# end container loop

# Convert length to GB
$TotalLengthGB = $TotalLength / 1024 / 1024 / 1024

# Result output
Write-Host "Total Length = " $TotalLengthGB "GB"

https://gist.github.com/iamsunny/8718fb29146363af11da95e5eb82f245

Keavy answered 14/3, 2021 at 17:9 Comment(0)
P
1

The Portal Storage Browser will show you the total data stored.

  1. Log in to Azure.

  2. Navigate to the Storage Account.

  3. Click on Storage Browser on the left.

Plumbiferous answered 19/3, 2023 at 23:19 Comment(0)
L
0

Getting this in PowerShell is kind of a pain, but it might be useful for other folks (such as for cleaning out old backups). Here's what I came up with; it should work with at least AzureRM module 6.13.0:

$azstorcontext = New-AzureStorageContext -StorageAccountName storageaccounthere -StorageAccountKey storageaccountkeyhere
$sizesOverall = @()
$containers = Get-AzureStorageContainer -Context $azstorcontext
foreach ($container in $containers)
{
    Write-Output $container.Name
    $contblobs = Get-AzureStorageBlob -Container $container.Name -Context $azstorcontext
    Write-Output "  Blobs: $($contblobs.Count)"
    $containersize = ($contblobs | Measure-Object -Sum Length).Sum
    Write-Output "    Container Size: $containersize (bytes)"
    # keep a record of each container's size so it can be inspected afterwards
    $sizesOverall += [PSCustomObject]@{ Container = $container.Name; Blobs = $contblobs.Count; SizeBytes = $containersize }
}
$sizesOverall
Liar answered 9/12, 2019 at 20:9 Comment(1)
The only problem with that script is that you get all the blobs from the SA containers, and when you do that in multiple SAs with blobs of dozens of terabytes, the script hangs for hours and sometimes it times out.Satori
