Sometimes speed and efficiency are not that important when writing quick and dirty PowerShell scripts. At other times they are critical. This article will highlight some of the common mistakes people make when writing PowerShell scripts that cause them to slow to a crawl.

PowerShell does a lot for us to make our scripting lives easier. This is one reason it's such a popular language in the Windows ecosystem. That flexibility and ease of use can sometimes come at a cost though.

How you write PowerShell can have a significant impact on performance. What seems innocuous when executed once can have a huge impact when executed hundreds or thousands of times in a loop or as part of a bigger script run against many machines.

Let's go over two of the most common reasons your PowerShell script may be running slow.

Arrays and the += assignment operator

It's very common to see a pattern like this in PowerShell. We define a result collection, loop over a large list of things, perform some type of process or calculation on it, and then add the finished item to the result collection. The benefit to this pattern is it is very easy to read and understand what is going on logically. Not all is as it seems though, and there are some serious performance penalties lurking in this pattern.
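A minimal sketch of the pattern looks like this (the loop body and the item count of 50,000 are illustrative, not from a specific workload):

```powershell
# Illustrative example: build up a results array with +=
$results = @()

foreach ($i in 1..50000) {
    # Some per-item work; here we just build a simple object
    $results += [PSCustomObject]@{
        Number = $i
        Square = $i * $i
    }
}
```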

Let's run this again but wrap it with Measure-Command to see how long it takes.
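Wrapping the same illustrative loop in Measure-Command looks like this:

```powershell
Measure-Command {
    $results = @()
    foreach ($i in 1..50000) {
        $results += [PSCustomObject]@{
            Number = $i
            Square = $i * $i
        }
    }
}
```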

[Screenshot: Speed of appending to arrays]

Yikes! Performing this simple task took nearly four seconds. We know PowerShell can do better than that. Let's see why this took so long.

In PowerShell (and in C#, for that matter), arrays are fixed in size: their length is set at creation time and cannot change. When you use +=, it looks like you are appending to an existing array, but PowerShell is actually creating a new fixed-size array containing the values from the old array plus the item you are adding, then discarding the old array. This is an expensive operation, both in terms of memory use and computation time. For small loops, you will probably not notice the performance hit. Only when performing this operation hundreds or thousands of times does this time and resource killer usually creep up on you.

.NET to the rescue

Fortunately, we have a set of .NET collection types we can use to speed this up. The most commonly used one with PowerShell is probably System.Collections.ArrayList. Internally, ArrayList uses a more efficient algorithm to increase the size of the list when needed. Here is how that same operation would look when using the ArrayList collection type.
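Using the same illustrative loop as above, the ArrayList version would look something like this:

```powershell
Measure-Command {
    $results = [System.Collections.ArrayList]@()
    foreach ($i in 1..50000) {
        # Add() returns the new item's index; redirect it to $null
        $results.Add([PSCustomObject]@{
            Number = $i
            Square = $i * $i
        }) > $null
    }
}
```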

You can see the code is nearly identical to the example above. Note that we're redirecting the output of the Add() method to $null. The Add() method returns the newly added item's index in the ArrayList. You may not want that output cluttering up your output stream, so we can redirect it to $null or capture it in a variable for later use.

Let's see how long this new method took.

[Screenshot: Speed when using ArrayList]

Using System.Collections.ArrayList took about 224 milliseconds. That is quite a difference!

Note: To add type safety, you can also use the strongly typed System.Collections.Generic.List collection. This collection type has a slight speed improvement over ArrayList as well.
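For example, a strongly typed list of integers could be created like this (the element type and values are illustrative):

```powershell
# Strongly typed generic list; only [int] values can be added
$results = [System.Collections.Generic.List[int]]::new()

foreach ($i in 1..50000) {
    # Unlike ArrayList.Add(), this Add() returns nothing,
    # so no $null redirection is needed
    $results.Add($i * $i)
}
```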


An alternative to using the .NET types is to assign the values output from the loop to a variable directly. PowerShell will collect all the items in the loop and assign them to $result as one operation.
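Sticking with the same illustrative loop, the direct assignment looks like this:

```powershell
Measure-Command {
    # PowerShell collects everything the loop outputs into $results
    $results = foreach ($i in 1..50000) {
        [PSCustomObject]@{
            Number = $i
            Square = $i * $i
        }
    }
}
```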

Now let's see how long this operation took.

[Screenshot: Inline array assignment]

This one took about 232 milliseconds, right in line with the ArrayList approach.

I encourage you to explore the System.Collections namespace and its documentation to learn more about the available .NET collection types.

Always filter left

Filtering data returned from cmdlets or functions is another common PowerShell pattern. Suppose we want to search the local event log for a certain event ID. A common approach you may see is something like this:
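A sketch of that common approach (the log name and event ID here are hypothetical placeholders, not from the original article):

```powershell
# Pull every event, then filter on the right side of the pipeline
Get-WinEvent -LogName 'Application' |
    Where-Object { $_.Id -eq 1001 }
```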

This is a slow operation since we're sending each event returned from Get-WinEvent down the pipeline and only then filtering with Where-Object. The pipeline will not finish until it processes every event log entry Get-WinEvent returns.

Let's wrap this in Measure-Command to see how long it takes.

[Screenshot: Filtering options to the right]

This operation took 2.6 seconds. Let's try to speed that up.

A better approach is to filter left as much as possible, using the built-in parameters many cmdlets provide to return only data that passes the filter. This is the "filter left, format right" saying you may have heard in PowerShell circles. For Get-WinEvent, we'll use the -FilterHashtable parameter to pass in our filters rather than piping all items to Where-Object.
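Using the same hypothetical log name and event ID as above, the filter-left version looks like this:

```powershell
# Filter at the source: Get-WinEvent only returns matching events
Get-WinEvent -FilterHashtable @{
    LogName = 'Application'
    Id      = 1001
}
```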

[Screenshot: Filtering objects to the left]

Interesting—this took 0.08 seconds vs. 2.61 seconds. In this example, -FilterHashtable is over 31 times faster!

As an exercise, try searching for other cmdlets and functions that have built-in filtering functionality by using Get-Command.
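One way to start that search (using Get-Command's -ParameterName parameter to find commands exposing a -Filter parameter):

```powershell
# List cmdlets that have a built-in -Filter parameter
Get-Command -ParameterName Filter -CommandType Cmdlet
```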


As you can see, simply following these two performance optimizations can lead to dramatic speed increases.

  1. Adminny 5 years ago

    I did the same.
    The first command took 21-22 seconds.
    The last command (FilterHashtable) took 0.097 seconds!

  2. Nice writing, Brandon. Some more useful .NET stuff I've found over the last years can be found here:
    Useful .NET classes for PowerShell

    If you have more or better examples, feel free to edit the page. – It’s a wiki 🙂

  3. Kevin Bates 5 years ago

    Thanks! I actually knew the different methods to do this but was unaware how much difference there is.

    Nicely written, Cheers!



© 4sysops 2006 - 2023

