Download with SMB
If you are working in a hybrid IT environment, you often need to download or upload files from or to the cloud in your PowerShell scripts. If you only use Windows servers that communicate through the Server Message Block (SMB) protocol, you can simply use the Copy-Item cmdlet to copy the file from a network share:
Copy-Item -Path \\server\share\file -Destination C:\path\
This assumes that you have a VPN solution in place so that your cloud network virtually belongs to your intranet. Things get a bit more complicated if we are leaving the intranet and have to download from an extranet or the Internet.
Download in PowerShell 2
The next simple case is where you have to download a file from the web or from an FTP server. In PowerShell 2, you had to use the New-Object cmdlet for this purpose:
$WebClient = New-Object System.Net.WebClient
$WebClient.DownloadFile("https://www.contoso.com/file", "C:\path\file")
As of PowerShell 3, we have the Invoke-WebRequest cmdlet, which is more convenient to work with. It is PowerShell’s counterpart to GNU wget, a popular tool in the Linux world, which is probably the reason Microsoft decided to use its name as an alias for Invoke-WebRequest. The name is perhaps an understatement, though: Invoke-WebRequest is more powerful than wget because it allows you to not only download files but also parse them. But this is a topic for another post.
Download with Invoke-WebRequest
To simply download a file through HTTP, you can use this command:
Invoke-WebRequest -Uri "http://www.contoso.com" -OutFile "C:\path\file"
In the example, we just download the HTML page that the web server at www.contoso.com generates. Note that, if you only specify the folder without the file name, as you can do with Copy-Item, PowerShell will error:
Invoke-WebRequest : Could not find a part of the path
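To avoid this error, you can derive the file name from the last segment of the URL and join it with the target folder. A minimal sketch, where the URL and folder are placeholders:

```powershell
# Hypothetical URL and target folder
$uri = "http://www.contoso.com/file.zip"
$folder = "C:\path"

# Take the last URL segment as the file name and build the full output path
$outFile = Join-Path $folder (Split-Path -Leaf $uri)

Invoke-WebRequest -Uri $uri -OutFile $outFile
```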
The shorter version for the command line is:
wget "http://www.contoso.com" -OutFile "file"
If you omit the local path to the folder, Invoke-WebRequest will just use your current folder. The -Outfile parameter is always required if you want to save the file. The reason is that, by default, Invoke-WebRequest sends the downloaded file to the pipeline.
However, the pipeline will then not just contain the contents of the file. Instead, you will find an object with a variety of properties and methods that allow you to analyze text files. If you send a binary file through the pipeline, PowerShell will treat it as a text file and you won’t be able to use the data in the file.
To only read the contents of the text file, we need to read the Content property of the object in the pipeline:
Invoke-WebRequest "http://www.contoso.com" | Select-Object -ExpandProperty Content | Out-File "file"
This command does the same thing as the previous one. The -ExpandProperty parameter ensures that the header (in this case, “Content”) won’t be stored in the file.
If you want to have the file in the pipeline and store it locally, you have to use the -PassThru parameter:
Invoke-WebRequest "http://www.contoso.com" -OutFile "file" -PassThru | Select-Object -ExpandProperty Content
This command stores the web page in a file and displays the HTML code.
Download and display file
Authenticating at a web server
If the web server requires authentication, you have to use the -Credential parameter:
Invoke-WebRequest -Uri "https://www.contoso.com/" -OutFile "C:\path\file" -Credential "yourUserName"
Note that, if you omit the -Credential parameter, PowerShell will not prompt you for a user name and password and will throw this error:
Invoke-WebRequest : Authorization Required
You have to at least pass the user name with the -Credential parameter. PowerShell will then ask for the password. If you want to avoid a dialog window in your script, you can store the credentials in a PSCredential object:
$Credentials = Get-Credential
Invoke-WebRequest -Uri "https://www.contoso.com" -OutFile "C:\path\file" -Credential $Credentials
You can use the -UseDefaultCredentials parameter instead of the -Credential parameter if you want to use the credentials of the current user. To add a little extra security, you might want to encrypt the password. Make sure to always use HTTPS instead of HTTP if you have to authenticate on a remote server. If the web server uses basic authentication, your password will be transmitted in clear text if you download via HTTP.
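If the script must run unattended, you can also build the PSCredential object without a prompt. The sketch below is for illustration only: the user name and password are placeholders, and hard-coding a plain-text password in a script is insecure; in practice, you would read an encrypted password from a file created with ConvertFrom-SecureString.

```powershell
# Insecure demo only: user name and password are placeholders
$password = ConvertTo-SecureString "P@ssw0rd" -AsPlainText -Force
$credentials = New-Object System.Management.Automation.PSCredential ("yourUserName", $password)

Invoke-WebRequest -Uri "https://www.contoso.com" -OutFile "C:\path\file" -Credential $credentials
```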
Note that this method only works if the web server manages authentication. Nowadays, most websites use the features of a content management system (CMS) to authenticate users. Usually, you then have to fill out an HTML form. I will explain in one of my next posts how you can do this with Invoke-WebRequest.
Downloading files through FTP works analogous to HTTP. You also shouldn’t use this protocol if security matters. To download multiple files securely, you had better work with SFTP or SCP. Invoke-WebRequest doesn’t support these protocols. However, third-party PowerShell modules exist that step into the breach.
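For a password-protected FTP server, the call looks much like the HTTP case; essentially, only the URI scheme changes. A sketch with placeholder server and file names:

```powershell
# Placeholder FTP server and file; the server prompts for credentials
$credentials = Get-Credential
Invoke-WebRequest -Uri "ftp://ftp.contoso.com/file.zip" -OutFile "C:\path\file.zip" -Credential $credentials
```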
In my next post, I will show how you can use Invoke-WebRequest to parse HTML pages and scrape content from websites.
Hi Michael – great article.
I am running a script on a scheduled basis (daily) to download a .csv file. However the uri changes every month, so I was wondering if the uri destination value can be set based on a value in a reference file as opposed to hard coding it, if so how?
You can store the URI in a text file and then read it in your script with Get-Content.
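For example, if the current download link is kept in a one-line text file (the paths here are placeholders), the script could look like this:

```powershell
# uri.txt contains a single line with the current download URL
$uri = Get-Content -Path "C:\path\uri.txt"
Invoke-WebRequest -Uri $uri -OutFile "C:\path\report.csv"
```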
Thanks Michael – worked perfectly!
I am downloading a zip file from a website using PowerShell; however, the issue is that I have to filter by date to download that zip file.
Is the date on the website? Then use Invoke-WebRequest to read the page and adapt the script to extract the right URL of the zip file. It will take some coding.
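One possible sketch, assuming the download page links to the zip files and today's date appears in the link target (the URL and the date format are made-up assumptions you would need to adapt):

```powershell
# Hypothetical download page; adjust the date format to match the site
$page = Invoke-WebRequest -Uri "https://www.contoso.com/downloads"
$date = Get-Date -Format "yyyy-MM-dd"

# Pick the first link whose target contains today's date
$link = $page.Links | Where-Object { $_.href -match [regex]::Escape($date) } | Select-Object -First 1
Invoke-WebRequest -Uri $link.href -OutFile "C:\path\file.zip"
```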
Great tips, can you tell me how you would apply this same concept in powershell to download all files from a web folder? Thank you in advance.
I'm not sure whether this is possible. You would somehow need to enumerate the contents of the folder and then download them. That is normally forbidden by web servers.
If you have a web server where directory browsing is allowed, I guess you could use Invoke-WebRequest/Invoke-RestMethod against that folder, which would list the available files. Then you could parse the output and request specific files to be downloaded (or all of them). But I don't see any straightforward way.
In case you don't want to specify the destination filename:
If filename is part of the url, you could use
$filename = [System.Uri]::UnescapeDataString((Split-Path -Leaf $strDownloadURL))
(e.g. "Test%20123.pdf" –> "Test 123.pdf")
If it's not: Use function "Get-RedirectedUrl" from https://stackoverflow.com/questions/25125818/powershell-invoke-webrequest-how-to-automatically-use-original-file-name/25127597#25127597
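Putting the first tip together (the URL here is a placeholder), you could derive the local file name from the URL like this:

```powershell
$strDownloadURL = "https://www.contoso.com/files/Test%20123.pdf"

# Take the last URL segment and decode percent-encoded characters
$filename = [System.Uri]::UnescapeDataString((Split-Path -Leaf $strDownloadURL))
# $filename is now "Test 123.pdf"

Invoke-WebRequest -Uri $strDownloadURL -OutFile (Join-Path "C:\path" $filename)
```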
I'm trying to get a list of log file paths for a cleanup script.
This works fine, but I cannot step through the content. When I put it through a foreach loop, it dumps every line at once. If I save it to a file, I can use [System.IO.File]::ReadLines to step through it line by line, but that only works if I download the file. How can I accomplish this without downloading the file?
Invoke-WebRequest only parses HTML; it can't parse arbitrary text files. If the text file is unstructured, you can parse it with a regex. More information about using regex in PowerShell can be found here and here.
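That said, you can still step through the downloaded text line by line without saving it to disk, by splitting the Content string on line breaks. A sketch with a placeholder URL:

```powershell
$response = Invoke-WebRequest -Uri "https://www.contoso.com/logpaths.txt"

# Split the downloaded text on Windows or Unix line endings
foreach ($line in ($response.Content -split "`r?`n")) {
    if ($line) { Write-Output $line }   # process each non-empty line here
}
```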
WebRequest gives memory errors when trying to use it to download large files.
Hello, I am having an issue with webrequest and downloading files.
I am trying to download files from a site; sadly, they are generated with the Epoch Unix timestamp included in the file name.
Also, all the files are kept in a single folder, such as Upload or Result.
Now, as I am unable to replace the Epoch Unix timestamp portion of the file name with a wildcard, I was wondering whether there is a way to do the download based on the date modified field of the file?
Thanks for this. I plan to use this along with task scheduler to download a fresh file every week. However, the new file overwrites the older one. Is there a way to preserve the older file as well?
@Ken – You should be able to relatively easily – however, you’d have to download it first, since you can’t get the file properties until you download it. You could download it to a temp location, grab the LastWriteTime stamp and parse it to create your new name.
@Sumit – You have a similar situation. Download your file to a temporary location, and then copy/rename it with a timestamp in the name, and then you’ll keep a running list. You’ll also need to manage the old copies so you don’t fill up your disk. 🙂
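A minimal sketch of that approach (the URL, paths, and the 90-day retention period are placeholders):

```powershell
# Download to a temporary location first
$temp = Join-Path $env:TEMP "download.tmp"
Invoke-WebRequest -Uri "https://www.contoso.com/file.csv" -OutFile $temp

# Rename the copy with a timestamp so older files are preserved
$stamp = Get-Date -Format "yyyy-MM-dd_HHmmss"
Move-Item -Path $temp -Destination "C:\path\file_$stamp.csv"

# Prune copies older than 90 days to keep the disk from filling up
Get-ChildItem "C:\path\file_*.csv" |
    Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-90) } |
    Remove-Item
```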
Hi, I'm a bit new to PowerShell and trying to figure out whether it is possible to add a PowerShell script, or a function inside a PowerShell script, to my path. At the moment, I have to open PowerShell in the directory where the script is located, or navigate to it, and then execute the script. I want to be able to execute the script from anywhere in the shell without specifying the path to the tool. Is that even possible?
Marta, you can append to the PATH environment variable. The PATH variable determines where the OS searches for executables.
You could change $env:Path in your PowerShell profile to make the change permanent:
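For instance, a line like this in your profile adds a script folder to the search path in every session (the folder name is a placeholder):

```powershell
# Put this in $PROFILE so it runs in every new session
$env:Path += ";C:\Scripts"
```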
What you should probably do is just either put the scripts you want in your $profile script or just dot source them from there. Having them in your path doesn’t necessarily do much for you.
If you just dot source them from your profile, then you can easily call your functions from anywhere.
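For example, with a line like this in your profile (the path is a placeholder), the functions defined in the script become available in every session:

```powershell
# Dot sourcing runs the script in the current scope,
# so its functions stay defined after it finishes
. "C:\Scripts\MyFunctions.ps1"
```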
David, that works if you just have a few tiny scripts. However, in general you will want to organize your scripts in folders. Why do you think that working with the path variable won’t work?
Hey. I am trying to write a script to automate file download in PowerShell from Microsoft website:
But all functions are failing to download… any idea?
I've written a small script to download a test file from my GitHub, and it works fine in PowerShell on my laptop; however, it throws the error below when I run the same script on Windows Server 2012. The only thing I noticed is that the PowerShell version differs between the laptop and the server.
Exception calling "DownloadFile" with "2" argument(s): "An exception occurred during a WebClient request."
At line:3 char:1
+ $WebClient.DownloadFile("$Link","$env:C:\Test_powershell\BackupCI.xls …
+ CategoryInfo : NotSpecified: (:) , MethodInvocationException
+ FullyQualifiedErrorId : WebException
any suggestions/help on this is highly appreciated
But I need help :
I would like to recover a file with a .tok extension in PS.
I try with 2 different methods : Invoke-WebRequest and Invoke-RestMethod
And neither works.
Today, my only solution is to convert the .tok file to a .txt file.
But I want to keep the original extension.
Do you have an idea?
For example, with Invoke-WebRequest:
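(The original command was posted as a screenshot; presumably it was something along these lines, with a placeholder server name:)

```powershell
# Hypothetical reconstruction of the failing call
Invoke-WebRequest -Uri "https://server.contoso.com/files/Test.tok" -OutFile "C:\path\Test.tok"
```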
Looks like Test.tok just doesn’t exist on the server.
The file is on the server.
Did you ever work with this kind of file ?
Maybe the web server is configured not to serve this file type. Try to open the URL in a web browser and see what happens. Invoke-WebRequest does the same as any browser.
I cannot paste a picture, but when I open the URL in a web browser, I see my two files, .txt and .tok.
Just tried it with a file with a .tok extension, and it worked. Maybe try it with another web server.