If you want to know the total size of all files in a particular folder, you can easily do this in Explorer. However, if you need more specific information, such as the size of a certain file type, PowerShell gives you more options. Measure-Object and Get-ChildItem are usually the tools of choice for the task.

At a command prompt, you can determine directory sizes with this command:

dir Downloads | find "File(s)"

But if you want to list file sizes in directory trees, this method won't work, and you will need tools such as Sysinternals disk usage (du).
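
For example, du can display the size of each subdirectory; a sketch, assuming the Sysinternals du syntax where the -l switch limits the recursion depth (check du's usage output for your version):

du -l 1 Downloads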

In PowerShell, Get-ChildItem (alias gci) and Measure-Object usually solve the problem in combination. The first cmdlet retrieves the file lists based on various filters, and the second one does the computation. The PowerShell counterpart of the above example looks like this:

gci Downloads | measure -Property Length -sum

You have to pass the attribute that Measure-Object is supposed to use for the calculation with the Property parameter. Here, an object of the type System.IO.FileInfo will give you the file size via the property Length. Because Property is a positional parameter, you can omit its name in short notation:

gci Downloads | measure Length -s
Displaying the total file size with Measure-Object

As you can easily see in the result, the Count property contains the number of files, and the value of Sum is the total size of all files. The unit is bytes, which you can convert to MB or GB:

(gci Downloads | measure Length -s).Sum /1GB

Converting file sizes to GB and processing the output with the format operator

The result will have quite a few decimal places. To round the result to two decimal places, the format operator is useful:

"{0:N2} GB" -f ((gci Downloads | measure Length -s).Sum /1GB)

Of course, with the help of a filter you can limit the output to certain file types:

(gci Downloads *.iso | measure Length -s).Sum /1GB

This command tells you how much space ISO files occupy in the Downloads folder.
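
If you want this breakdown for all file types at once, you can combine Group-Object (alias group) with Measure-Object. A minimal sketch:

gci Downloads -File | group Extension | % {
    [pscustomobject]@{
        Extension = $_.Name
        Files     = $_.Count
        SizeMB    = [math]::Round(($_.Group | measure Length -Sum).Sum / 1MB, 2)
    }
} | sort SizeMB -Descending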

It gets more interesting if you want to calculate the size of all subdirectories:

gci -Dir -r | %{ $_.FullName; ((gci -File $_.FullName | measure Length -Sum).Sum) / 1MB }
Computing the size of subdirectories with gci and measure

Here, the first command collects the names of all folders in the current directory and its subdirectories and pipes the result to ForEach-Object (alias "%"). The loop first displays the directory name and then calls Get-ChildItem again with the File parameter to pass the contents of the corresponding folder to Measure-Object.
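
A slightly more polished variant along the same lines (a sketch) emits objects that you can sort or export:

gci -Dir -r | % {
    [pscustomobject]@{
        Directory = $_.FullName
        SizeMB    = [math]::Round((gci -File $_.FullName | measure Length -Sum).Sum / 1MB, 2)
    }
} | sort SizeMB -Descending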

By the way, to find only empty subdirectories, you don't need Measure-Object. You can just use the result of the GetFiles() method:

gci -Dir -r | %{ if ($_.GetFiles().Length -eq 0) { $_.FullName } }

In addition to the sum, Measure-Object also processes other values such as the average, the minimum, and the maximum:

gci -File | Measure-Object Length -Max -Min -Sum -Average

However, in the context of file sizes, these parameters are only of limited use. For instance, Measure-Object does not give you the name of the largest or smallest files. If you want to identify the biggest storage hogs, you will need Sort-Object.
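
For instance, this sketch lists the ten largest files below the current directory:

gci -File -r | sort Length -Descending |
    select -First 10 FullName, @{ n = 'SizeMB'; e = { [math]::Round($_.Length / 1MB, 2) } }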

19 Comments
  1. Wolfgang,

    These are great posts, but you are making one mistake that I think is pretty bad. You are using aliases and abbreviated cmdlet names in your code, which is perfectly acceptable for day-to-day usage but not a best practice when sharing code. I shorten my cmdlets all day long, but I don't shorten cmdlets or use aliases when explaining to others or sharing scripts.

    For example, a new user getting started with Powershell will not know that % is shorthand for foreach-object. I wasted a full day chasing this alias down one day when I first started with PS because I didn't know about aliases or how to look them up.

    These are solid articles otherwise! Nice job.

  2. Wolfgang Sommergut 4 years ago

    Mike,

    Thanks for your feedback, good point! I have assumed that these commands are being used interactively. In a script, aliases should not be used, since you cannot be sure they exist on a target machine. But you are right, from a didactic perspective I should explain an alias before using it for the first time.

    Wolfgang

  3. Mike, you are right, aliases shouldn't be used in scripts that you share. It is also best practice not to use aliases in scripts that you don't share, even if the aliases are common and available everywhere, because if you have to review your own script in a year or two, it will be much easier to read.

    I also often use the cmdlet names in blog posts, even in interactive scenarios, because these commands are easier to understand for beginners. You can usually assume that the reader will never really type these commands but will just copy and paste them.

    However, writing a blog post where you explain how certain tasks can be accomplished interactively is a completely different scenario. The reason why aliases exist is that cmdlet names can be pretty long-winded. Long names improve readability but also mean a lot of typing compared to other popular shells such as bash. Tab completion can help here, but it will still slow you down significantly. I mean, who really types "Get-ChildItem" on a console to view the contents of a folder?

    Thus, if you are teaching how to work on a console, aliases are the main players because this is what learners are supposed to actually use. Of course, you should mention the corresponding cmdlet names in the text, which is what Wolfgang usually does.

  4. Ian 4 years ago

    Really good explanation of how to calculate folder size. I am always having to find this out for other people so will now script it and let them run it when they want to. This is going to save me time. Thanks.

  5. Kim Vogel 3 years ago

    Hi, I was just trying to get the total number of files in each folder recursively. I tried

    $folders = get-childitem -recurse "mypath"
    $folders |
    foreach { $_.FullName; (get-childitem -File $_.FullName | Measure-Object) }

    It worked, but it also returned counts (always just 1) for each of the text files in each subfolder. Is there any way to limit it to the number of files in the directories?

  6. JillJ 2 years ago

    None of the file-counts or file-sizes are correct using PowerShell.

    They *NEVER* match using File Explorer >> Right-click >> Properties.

    Not even close.

  7. @JillJ I would trust the Powershell numbers over the File Explorer numbers any day.  Explorer does all kinds of rounding, estimating, etc. 

    However, PS won't necessarily pick up hidden files, system files, etc. It also doesn't pick up files you don't have access to. So, a lot depends on exactly what you are looking for. Another important thing is *how* you gather them.
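
    For example, Get-ChildItem's -Force switch makes it include hidden and system items (a quick sketch with a placeholder path):

    (Get-ChildItem -Path "somepath" -Recurse -File -Force | Measure-Object Length -Sum).Sum / 1GB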

    @Kim Vogel: You just need a couple of minor changes..

    $folders = get-childitem -Directory -Path "somepath"
    $folders |
    ForEach-Object {
        $Directory = $_.FullName
        [pscustomobject]@{
            Directory = $Directory
            FileCount = (Get-ChildItem -Path $Directory -Recurse -File).Count
        }
    }
    
    
    # The output will look something like this:
    
    Directory               FileCount
    ---------               ---------
    C:\SomePath\SubDir1             3
    C:\SomePath\SubDir2             2
    C:\SomePath\SubDir3             4
    C:\SomePath\SubDir4             5

    You need to specify -directory on the first line to avoid picking up files.

    And using the [pscustomobject] lets you gather the info together and present it cleanly.  

    • N Srinivasan 6 months ago

      Can you please help me get the total file size on top of this code?

      $folders = get-childitem -Directory -Path "somepath"
      $folders |
      ForEach-Object {
          $Directory = $_.FullName
          [pscustomobject]@{
              Directory = $Directory
              FileCount = (Get-ChildItem -Path $Directory -Recurse -File).Count
          }
      }

      # The output should look something like this:

      Directory               FileCount    Size (MB/GB)
      ---------               ---------    ------------
      C:\SomePath\SubDir1             3            1024
      C:\SomePath\SubDir2             2       222222222
      C:\SomePath\SubDir3             4     67676355363
      C:\SomePath\SubDir4             5  12345678967638
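
      One way to add a size column (a sketch; rounding to MB is an assumption):

      $folders = Get-ChildItem -Directory -Path "somepath"
      $folders |
      ForEach-Object {
          $Directory = $_.FullName
          $Files = Get-ChildItem -Path $Directory -Recurse -File
          [pscustomobject]@{
              Directory = $Directory
              FileCount = $Files.Count
              # Total length of all files in the folder, converted from bytes to MB
              SizeMB    = [math]::Round(($Files | Measure-Object Length -Sum).Sum / 1MB, 2)
          }
      }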

  8. Cornel H 2 years ago

    David, easy to understand and very well explained with examples. Thanks!

    How can I export the results from a [pscustomobject] to a log file (i.e., Export-Csv, etc.)? Export-Csv prompts me for an InputObject.

  9. You have it correct, Cornel.. on line 9, you just add to it..

    $folders = get-childitem -Directory -Path "somepath"
    $folders |
    ForEach-Object {
        $Directory = $_.FullName
        [pscustomobject]@{
            Directory = $Directory
            FileCount = (Get-ChildItem -Path $Directory -Recurse -File).Count
        }
    } | Export-Csv -Path <path> -NoTypeInformation

    David F. 

  10. Cornel H 2 years ago

    That worked great! Thank you.

  11. Larry Wakeman 2 years ago

    Hi, I just started using PowerShell today (though I'm a seasoned programmer).  What I'm trying to do is sum file sizes for specific file types, like *.msp.  Thanks!

  12. @Larry Wakeman,

    All you have to do is filter your Get-ChildItem, grab the length, and sum it up..

    Get-ChildItem -Path <path> -Recurse -File -Filter *.msp | ForEach-Object {
        # Accumulate the length of each file (+=, not =, so the values add up)
        $TotalLength += $_.Length
    }
    $TotalLength
    $TotalLength / 1MB
    $TotalLength / 1GB

    You can shorten it up and remove the -file and the -filter

    Get-ChildItem -Path .\*.psm1 -Recurse | ForEach-Object { $TotalLength += $_.Length }

    or.. we can go overboard and turn it into a full blown function..

    function Get-FileSizeSum {
        [cmdletbinding()]
        param (
            [parameter(Mandatory, ValueFromPipelineByPropertyName)]
            [ValidateScript({Test-Path -Path $_})]
            [string]$FilePath,

            [parameter(ValueFromPipelineByPropertyName)]
            [string]$Extension = '.msp',

            [parameter(ValueFromPipelineByPropertyName)]
            [ValidateSet('KB', 'MB', 'GB', 'Byte')]
            [string]$SizeDisplay = 'MB',

            [parameter(ValueFromPipelineByPropertyName)]
            [switch]$Recurse
        )

        # Get-ChildItem expects -Path and -Filter, so map the parameters accordingly
        $Parameters = @{
            'Path'   = $FilePath
            'Filter' = ('*' + $Extension)
        }

        if ($Recurse) {
            $Parameters.Add('Recurse', $true)
        }

        $TotalSize = 0
        Get-ChildItem -File @Parameters | ForEach-Object {
            $TotalSize += $_.Length
        }

        # Convert the byte count to the requested unit
        switch ($SizeDisplay) {
            'KB'    { $TotalSize / 1KB }
            'MB'    { $TotalSize / 1MB }
            'GB'    { $TotalSize / 1GB }
            Default { $TotalSize }
        }
    }

    I'm showing all kinds of things you *can* do.. 

    David F. 

  13. Jay Montana 2 years ago

    Hello,

    This is very helpful, and I am trying to adapt it to what I need, but I am stuck.

    I need to run through a list of shares (UNC paths) and report the size of each share and how many files it contains. I need this for all folders, subfolders, and files, but I don't need a breakdown of the subfolders and files. I do not need hidden files.

    So, for example, these 3 shares. I need to import a CSV list of shares/paths.

    \\Server1\UserShare1\Bob

    \\Server2\UserShare2\John

    \\Server3\UserShare3\Amy

    thanks!

  14. Are you on the inside or the outside? (i.e., calculating from the server itself or from a workstation?)

    You can just use the function I wrote out and use the extension as .*, and we can add a tracker to the function.. 

    function Get-FileSizeSum {
        [cmdletbinding()]
        param (
            [parameter(Mandatory, ValueFromPipelineByPropertyName)]
            [ValidateScript({Test-Path -Path $_})]
            [string]$FilePath,

            [parameter(ValueFromPipelineByPropertyName)]
            [string]$Extension,

            [parameter(ValueFromPipelineByPropertyName)]
            [switch]$FilesOnly,

            [parameter(ValueFromPipelineByPropertyName)]
            [ValidateSet('KB', 'MB', 'GB', 'Byte')]
            [string]$SizeDisplay = 'MB',

            [parameter(ValueFromPipelineByPropertyName)]
            [switch]$Recurse
        )

        # A process block, so each pipeline record is handled individually
        process {
            # Get-ChildItem expects -Path and -Filter, so map the parameters
            $Parameters = @{
                'Path' = $FilePath
            }

            if ($FilesOnly) {
                $Parameters.Add('File', $true)
            }

            if ($Recurse) {
                $Parameters.Add('Recurse', $true)
            }

            if ($Extension) {
                $Parameters.Add('Filter', ('*' + $Extension))
            }

            $TotalSize = 0
            $Files = @(Get-ChildItem @Parameters | Select-Object -Property Length)
            $Files | ForEach-Object {
                $TotalSize += $_.Length
            }

            # Convert the byte count to the requested unit
            switch ($SizeDisplay) {
                'KB' { $TotalSize = $TotalSize / 1KB }
                'MB' { $TotalSize = $TotalSize / 1MB }
                'GB' { $TotalSize = $TotalSize / 1GB }
            }

            [pscustomobject]@{
                'Path'      = $FilePath
                'FileCount' = $Files.Count
                'TotalSize' = $TotalSize
            }
        }
    }

    That adjusted version should get you what you are looking for.  Just feed in your list of paths, and that should get you the data 🙂
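
    For example, assuming the CSV has a FilePath column (a hypothetical column name), the list can be piped straight into the function:

    Import-Csv -Path shares.csv | Get-FileSizeSum -FilesOnly -Recurse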

    David F.

  15. Om 1 year ago

    How should I feed the path? Please advise, I'm a bit of a novice.

  16. You would just call the function with the parameter of the path you want to measure.

    Something like:

    Get-FileSizeSum -FilePath c:\windows -FilesOnly -Recurse

    David F

  17. Tony 12 months ago

    Thanks for the article. I am trying to sum up 500,000 files, approx. 30 GB, on a remote PC. While it works, it is very slow. Any suggestions for speeding it up?

  18. You can invoke it remotely (e.g., with Invoke-Command); that will be considerably faster than pulling the file information across the network.

    I worked up a new version that works with both Windows Powershell and Powershell Core. It is a *lot* faster; however, I can't seem to work around a bug in it.

    function Get-FileSizeSum {
        [cmdletbinding()]
        param (
            [parameter(Mandatory, ValueFromPipelineByPropertyName)]
            [ValidateScript({Test-Path -Path $_})]
            [string]$FilePath,

            [parameter(ValueFromPipelineByPropertyName)]
            [string]$SearchPattern = '*',

            [parameter(ValueFromPipelineByPropertyName)]
            [ValidateSet('MB', 'GB', 'TB', 'KB', 'Byte')]
            [string]$SizeDisplay = 'MB',

            [parameter(ValueFromPipelineByPropertyName)]
            [switch]$Recurse,

            [parameter(ValueFromPipelineByPropertyName)]
            [ValidateScript({
                if ($PSVersionTable.PSVersion.Major -gt 5) {
                    $WindowsOnlyOptions = @(
                        'Archive', 'Compressed', 'Device', 'Directory',
                        'Encrypted', 'IntegrityStream', 'NoScrubData',
                        'NotContentIndexed', 'Offline', 'SparseFile',
                        'System', 'Temporary'
                    )
                    if (($IsMacOS -or $IsLinux) -and $_ -in $WindowsOnlyOptions) {
                        Write-Warning -Message 'Invalid FileAttribute chosen for this OS'
                    }
                    $true
                }
                else {
                    Throw 'This parameter is invalid for Windows Powershell, it is only for Powershell Core'
                }
            })]
            [System.IO.FileAttributes[]]$AttributesToSkip
        )

        $Path = Get-Item -Path $FilePath
        if ($PSVersionTable.PSVersion.Major -gt 5) {
            $EnumerationOptions = [System.IO.EnumerationOptions]::new()
            if ($Recurse) {
                # The property is called RecurseSubdirectories, not Recurse
                $EnumerationOptions.RecurseSubdirectories = $true
            }
            if ($AttributesToSkip) {
                # Combine the individual attributes into a single flags value
                $Skip = [System.IO.FileAttributes]0
                foreach ($Attribute in $AttributesToSkip) {
                    $Skip = $Skip -bor $Attribute
                }
                $EnumerationOptions.AttributesToSkip = $Skip
            }
            $SearchOption = $EnumerationOptions
        }
        else {
            if ($Recurse) {
                $SearchOption = [System.IO.SearchOption]::AllDirectories
            }
            else {
                $SearchOption = [System.IO.SearchOption]::TopDirectoryOnly
            }
        }

        $TotalLength = 0
        foreach ($File in $Path.EnumerateFiles($SearchPattern, $SearchOption)) {
            $TotalLength += $File.Length
        }
        switch ($SizeDisplay) {
            'KB'    { [math]::Round($TotalLength/1KB, [System.MidpointRounding]::AwayFromZero); Break }
            'MB'    { [math]::Round($TotalLength/1MB, [System.MidpointRounding]::AwayFromZero); Break }
            'GB'    { [math]::Round($TotalLength/1GB, [System.MidpointRounding]::AwayFromZero); Break }
            'TB'    { [math]::Round($TotalLength/1TB, [System.MidpointRounding]::AwayFromZero); Break }
            Default { $TotalLength }
        }
        Write-Verbose -Message ('Size Display is in {0}' -f $SizeDisplay)
    }

    The bug is that it chokes if a subdirectory is inaccessible. The default behavior of the .NET call to EnumerateFiles() in .NET Core is to ignore those errors. However, it seems to still just choke and crash on it.

    It is not 100% tested, but the initial tests work pretty well, and like I said it is substantially faster.  I'm going to do a little more research on this, because I'm deeply curious as to why and how to fix it 🙂

    Another major difference in this version - I'm not checking extensions. The .NET call uses a simple match search pattern: * = zero or more matching characters at a position, ? = zero or one matching character at a position.
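
    A quick usage sketch (the path and pattern are just placeholders):

    Get-FileSizeSum -FilePath C:\Temp -SearchPattern *.log -Recurse -SizeDisplay GB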

    David F. 

