du in PowerShell?
How can I get a du-ish analysis using PowerShell? I'd like to periodically check the size of directories on my disk.

The following gives me the size of each file in the current directory:

foreach ($o in gci)
{
   Write-Output $o.Length
}

But what I really want is the aggregate size of all files in the directory, including subdirectories. I'd also like to be able to optionally sort the output by size.

Emergent answered 15/5, 2009 at 12:0 Comment(7)
In the absence of serverfault, this would have been a good SO question. I think it now belongs on SF (where I will await the answer, since I'd like the same thing).Consequential
@John Saunders: I think this is not entirely true. Why should shell programming questions be taken to serverfault?Trimly
I have also voted to close the question as 'belongs to serverfault.com', and I have upvoted @John's comment, but I am now considering it again. If a user asks about how to program a script in shell, I would accept it as a stackoverflow question. How does it differ if the shell is powershell instead of sh?Cogent
@all: I'm wrestling with this sort of question now we have SF. This one is in the grey area. Is it a system admin question? Rather, is it a question we want sysadmins to find when they Google for it? Then it belongs on SF. Will it be developers Googling for it? Belongs on SO. A developer who has to admin his own machine to some extent (like me)? Then I don't really know. I'd leave it here for now. Maybe I'll just start asking the OP for the context and decide from there.Consequential
serverfault is in private beta? ergo, we ARE in the absence of serverfault.Emergent
I think it will be inevitable anyway to establish a firm cross-linking solution between SF and SO for these kinds of questions. I would not say that the strict "will developers be googling for it, then here, else there" approach cuts it. People who are having the problem will be googling for it, and they are interested in the solution, not the site the solution is on. As long as it is not hugely off-topic (and this one isn't), I think it is in order to leave such a question here.Trimly
learn.microsoft.com/en-us/sysinternals/downloads/du — download this tool, drop it in some directory, add that directory to your PATH, and then use it from PowerShell.Frogmouth
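For reference, once the Sysinternals du.exe suggested above is on your PATH, a minimal call looks something like this (the path is just an example; if I recall the switches correctly, -l limits how many directory levels are reported):

du -l 1 "C:\Program Files"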

There is an implementation available at the "Exploring Beautiful Languages" blog:

"An implementation of 'du -s *' in Powershell"

function directory-summary($dir=".") { 
  get-childitem $dir | 
    % { $f = $_ ; 
        get-childitem -r $_.FullName | 
           measure-object -property length -sum | 
             select @{Name="Name";Expression={$f}},Sum}
}

(Code by the blog owner: Luis Diego Fallas)

Output:

PS C:\Python25> directory-summary

Name                  Sum
----                  ---
DLLs              4794012
Doc               4160038
include            382592
Lib              13752327
libs               948600
tcl               3248808
Tools              547784
LICENSE.txt         13817
NEWS.txt            88573
python.exe          24064
pythonw.exe         24576
README.txt          56691
w9xpopen.exe         4608
Trimly answered 15/5, 2009 at 12:6 Comment(4)
cool! I am in awe of Luis' powershell fu. according to powershell conventions, shouldn't the name of functions be verb-object ? So... summarize-directory or something, instead of directory-summary ?Emergent
Ask him. I'm just quoting him. :) But I think that was the convention.Trimly
it would be something more like get-directorysummary (there is a standard list of verbs)Precess
Get-DiskUtilization might be appropriate.Aloisia
P
34

I modified the command in the answer slightly to sort descending by size and include size in MB:

gci . | 
  %{$f=$_; gci -r $_.FullName | 
    measure-object -property length -sum |
    select  @{Name="Name"; Expression={$f}}, 
            @{Name="Sum (MB)"; 
            Expression={"{0:N3}" -f ($_.sum / 1MB) }}, Sum } |
  sort Sum -desc |
  format-table -Property Name,"Sum (MB)", Sum -autosize

Output:

PS C:\scripts> du

Name                                 Sum (MB)       Sum
----                                 --------       ---
results                              101.297  106217913
SysinternalsSuite                    56.081    58805079
ALUC                                 25.473    26710018
dir                                  11.812    12385690
dir2                                 3.168      3322298

Maybe it is not the most efficient method, but it works.

Paddock answered 10/10, 2012 at 16:23 Comment(2)
I love this. I added a simple change of "gci -r -file $_.FullName" so that it only tries to sum up files and not any old child item, which was causing some errors for meZakarias
This is the fastest and best I've found so far. I struggled a bit with gci . instead of gci $path, but after figuring that out I added this as a function to my profile. Works like a charm now.Leila

If you only need the total size of a given path, a simplified version is:

Get-ChildItem -Recurse ${HERE_YOUR_PATH} | Measure-Object -Sum Length
Inclose answered 21/11, 2020 at 4:28 Comment(1)
Thanks - this is the form I ended up using; further shortened to: gci -recurse ${path} | measure -sum length | select { $_.sum / 1MB }Deformity
function Get-DiskUsage ([string]$path=".") {
    # Sum file lengths per directory (recursively), grouped by directory name
    $groupedList = Get-ChildItem -Recurse -File $path | Group-Object DirectoryName |
        Select-Object Name, @{Name='Length'; Expression={ ($_.Group | Measure-Object -Sum Length).Sum }}
    # For each directory, fold in the totals of its subdirectories
    foreach ($dn in $groupedList) {
        $total = ($groupedList | Where-Object { $_.Name -like "$($dn.Name)*" } | Measure-Object -Sum Length).Sum
        New-Object psobject -Property @{ DirectoryName = $dn.Name; Length = $total }
    }
}

Mine is a bit different; I group all of the files on directoryname, then walk through that list building totals for each directory (to include the subdirectories).
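A quick usage sketch (the path is just an example), showing the per-directory totals sorted largest first:

Get-DiskUsage -Path C:\Temp |
    Sort-Object Length -Descending |
    Format-Table DirectoryName, @{Name='Size (MB)'; Expression={[math]::Round($_.Length / 1MB, 2)}} -AutoSize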

Procurance answered 12/2, 2016 at 15:19 Comment(2)
this one's the best - it's got a totalVoorhis
Fails to account for hidden and system files.Sanitary

Building on previous answers, this will work for those who want to show sizes in KB, MB, GB, etc., and still be able to sort by size. To change units, just change "MB" to the desired unit in both "Name=" and "Expression=". You can also change the number of decimal places shown (rounding) by changing the "2". (A parameterized variant is sketched after the example output below.)

function du($path=".") {
    Get-ChildItem $path |
    ForEach-Object {
        $file = $_
        Get-ChildItem -File -Recurse $_.FullName | Measure-Object -Property length -Sum |
        Select-Object -Property @{Name="Name";Expression={$file}},
                                @{Name="Size(MB)";Expression={[math]::round(($_.Sum / 1MB),2)}} # round 2 decimal places
    }
}

This gives the size as a number rather than a string (as in another answer), so one can sort by size. For example:

PS C:\Users\merce> du | Sort-Object -Property "Size(MB)" -Descending

Name      Size(MB)
----      --------
OneDrive  30944.04
Downloads    401.7
Desktop     335.07
.vscode     301.02
Intel         6.62
Pictures      6.36
Music         0.06
Favorites     0.02
.ssh          0.01
Searches         0
Links            0
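If you switch units often, a small parameterized variant saves the editing. This is just a sketch of the function above; the $unit parameter is my own addition rather than part of the original answer:

function du($path=".", $unit=1MB) {
    # $unit can be 1KB, 1MB, or 1GB; the size column is rounded to 2 decimal places
    Get-ChildItem $path |
    ForEach-Object {
        $file = $_
        Get-ChildItem -File -Recurse $_.FullName | Measure-Object -Property Length -Sum |
        Select-Object -Property @{Name="Name";Expression={$file}},
                                @{Name="Size";Expression={[math]::Round(($_.Sum / $unit),2)}}
    }
}

For example, du . 1GB lists the immediate children of the current directory with sizes in GB, and the result can still be piped to Sort-Object -Property Size -Descending.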
Wong answered 2/7, 2020 at 22:17 Comment(0)
Here is a simple recursive powershell script that does the job.
It is not as slick as the Sysinternals du; in this example it shows directories under C:\Program Files that contain more than 200 MB.

Function List-DiskUsage {
    Param ($Path = '.\')

    # On the first (outermost) call, reset the level counter
    if ($FirstTime) {
        $lvl = 0
        $FirstTime = $false
    }
    $ar[$lvl].WholePath = $Path

    Get-ChildItem -Path $Path -Force -ErrorAction SilentlyContinue | ForEach-Object {
        If (! $_.PSIsContainer) {
            # A file: add its size (in MB) to the running total for this level
            $len = [math]::Round($_.Length / 1MB, 2)
            $ar[$lvl].AccumSize += $len
            # write-host $lvl "   File      " $_.Name "  " $len "  " $_.FullName
        } ElseIf ($_.PSIsContainer) {
            # A directory: recurse one level deeper
            $ar[$lvl].NumSubDirs += 1
            $lvl += 1
            # write-host $lvl "   Directory " $_.Name "  " $_.Length "  " $_.FullName
            List-DiskUsage -Path $_.FullName
            $lvl -= 1

            # Print the subdirectory if it holds more than 200 MB
            if ($ar[$lvl + 1].AccumSize -gt 200) {
                $newlen = [math]::Round($ar[$lvl + 1].AccumSize, 0)
                Write-Host $newlen "`tMB     " $ar[$lvl + 1].WholePath " Dirs= " $ar[$lvl + 1].NumSubDirs
            }

            # Add this level's totals into the next higher level, then reset the slot
            $ar[$lvl].AccumSize += $ar[$lvl + 1].AccumSize
            $ar[$lvl + 1].AccumSize = 0
            $ar[$lvl + 1].WholePath = ""
            $ar[$lvl + 1].NumSubDirs = 0
        } # End If-ElseIf.
    } # End ForEach-Object.

    # At the top level, print the grand total
    if ($lvl -eq 0) {
        $newlen = [math]::Round($ar[0].AccumSize, 0)
        Write-Host $newlen "`tMB     " $ar[0].WholePath " Dirs= " $ar[0].NumSubDirs
    }
} # End Function: List-DiskUsage.




$Global:FirstTime = $true
$Global:lvl = 0

# Pre-allocate one accumulator slot per directory level (20 levels deep here)
$Global:ar = 1..20 | ForEach-Object {
    [pscustomobject]@{ WholePath = ''; AccumSize = 0; NumSubDirs = 0 }
}

List-DiskUsage -Path "C:\Program Files"
Cockspur answered 13/5, 2023 at 12:27 Comment(1)
And if you want to restrict the printing to a particular depth from the start directory, remove the block where I test -gt 200 and modify the level-0 write near the end to: if ($lvl -le $lvlreq -and $ar[$lvl].AccumSize -gt 200) { $newlen=[math]::Round($ar[$lvl].AccumSize,0); Write-host $newlen "`tMB " $ar[$lvl].WholePath " Dirs= " $ar[$lvl].NumSubDirs }, and define a global variable such as $Global:lvlreq=1 (or whatever depth you want). Of course you can parameterise all of this and also pipe it to sort-object if needed.Cockspur
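A cleaned-up reconstruction of the modification described in that comment (my reading of it; $lvlreq is the maximum depth to print):

$Global:lvlreq = 1   # print directories at most this many levels below the start

# ...and in List-DiskUsage, replace the final "if ($lvl -eq 0)" block with:
if ($lvl -le $lvlreq -and $ar[$lvl].AccumSize -gt 200) {
    $newlen = [math]::Round($ar[$lvl].AccumSize, 0)
    Write-Host $newlen "`tMB     " $ar[$lvl].WholePath " Dirs= " $ar[$lvl].NumSubDirs
}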

My own take using the previous answers:

function Format-FileSize([int64] $size) {
    if ($size -lt 1024)
    {
        return $size
    }
    if ($size -lt 1Mb)
    {
        return "{0:0.0} Kb" -f ($size/1Kb)
    }
    if ($size -lt 1Gb)
    {
        return "{0:0.0} Mb" -f ($size/1Mb)
    }
    return "{0:0.0} Gb" -f ($size/1Gb)
}

function du {
        param(
        [System.String]
        $Path=".",
        [switch]
        $SortBySize,
        [switch]
        $Summary
    )
    $path = (Get-Item $Path).FullName  # resolve the target path (used by the -Summary filter below)
    $groupedList = Get-ChildItem -Recurse -File $Path | 
        Group-Object directoryName | 
            select name,@{name='length'; expression={($_.group | Measure-Object -sum length).sum } }
    $results = ($groupedList | % {
        $dn = $_
        if ($summary -and ($path -ne $dn.name)) {
            return
        }
        $size = ($groupedList | where { $_.name -like "$($dn.name)*" } | Measure-Object -Sum length).sum
        New-Object psobject -Property @{ 
            Directory=$dn.name; 
            Size=Format-FileSize($size);
            Bytes=$size
        }
    })
    if ($SortBySize)
        { $results = $results | sort-object -property Bytes }
    $results | more
}
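Example calls (the paths are illustrative):

du -Path C:\Users\me\Projects -SortBySize   # per-directory breakdown, smallest to largest
du -Summary                                 # only the grand total for the current directory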
Swashbuckling answered 7/2, 2021 at 12:24 Comment(1)
get-item "." should be get-item $PathSpecter

Using Get-ChildItem is not that slow:

(Get-ChildItem -Path $path -Recurse | Measure-Object -Property Length -Sum).Sum / 1GB
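To get a friendlier number, you can round or format the result, e.g.:

"{0:N2} GB" -f ((Get-ChildItem -Path $path -Recurse | Measure-Object -Property Length -Sum).Sum / 1GB)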
Opportune answered 11/4, 2023 at 12:33 Comment(0)
