The little script discussed in this post helps you to incrementally copy large folders with the assistance of the venerable xcopy tool and PowerShell.

Files can be copied with a couple of mouse clicks in File Explorer; however, if you have to copy hundreds or thousands of files, things get a bit more complicated.

I was recently assigned a very simple task: copy the sales department folders from an old file server to a new one. Drag and drop, yeah? With a few gigabytes of data and several thousand files, drag and drop doesn't look like a great idea. The biggest problem is not even the speed. What if the copy operation partly fails? There is no way of telling which files were copied and which were not.

It is also worth mentioning that one of the requirements was to build a tool that end users can run to repeat the process. For instance, if somebody changed already-copied files while the rest were still copying, the tool needs to be able to run an incremental sync at some point.

First, I thought about utilizing robocopy. It is a very nice utility with a lot of options. However, it didn't work for me because it spends too much time retrying failed files. Even with the minimal number of retries set, it wasn't acceptable. Another problem was that there is no obvious way to log information about failed copies only. There is verbose logging, which is nice, but the last thing I needed was hundreds of thousands of lines to analyze.
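To give you an idea of what "minimal retries" means here: robocopy's /r switch sets the number of retries on failed copies, and /w sets the wait time between them. A call with both dialed down to zero looks something like this (the paths and the remaining switches are illustrative, not the exact command I ran):

robocopy \\oldserver\sales \\newserver\sales /e /copyall /r:0 /w:0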

I then decided to try the Copy-Item PowerShell cmdlet. This is a great tool, except it cannot always copy opened files and is not good with incremental copying.
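For comparison, a straightforward Copy-Item approach would look something like the line below (the paths are placeholders). Note that it simply recopies everything on every run, because the cmdlet performs no timestamp comparison:

Copy-Item -Path '\\oldserver\sales\*' -Destination '\\newserver\sales' -Recurse -Force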

Finally, I remembered the old, faithful xcopy utility that has always worked for me, even way before Microsoft rolled out PowerShell. However, it does not have the logging capabilities I needed. Thus, I decided to use xcopy in combination with PowerShell.

Below is the script I used for my copy task; I will explain how it works afterward.

<#
.SYNOPSIS
    Copies folder content from source location to destination including ACLs.    
.DESCRIPTION
    Copies folder content from source location to destination including ACLs. 
    If the files and folders in the destination location already exist, the script performs an incremental copy.
    On subsequent runs, every item that has changed since the previous copy will be copied to the destination again.
    The error log file is created and populated with errors that occurred during the copy process, if any. 
.PARAMETER Source
    Path to the folder of the network share that will be copied.
.PARAMETER Destination
    Path to the folder of the network share where the source content will be copied.  
.PARAMETER Log
    Path to the log file to save copy errors if any. If omitted, it will save logs to the c:\temp\copylog.txt file.
    
.EXAMPLE
    PS C:\Scripts\> .\Copy_Folders_and_subfilders_with_Error_Logging_Incrementaly.ps1 -Source \\windows10\share1 -Destination \\windows10\share2 -Log c:\temp\copylog.log
    This example copies files and folders from the \\windows10\share1 location to the \\windows10\share2 one and stores errors in the c:\temp\copylog.log file.
.NOTES
    Author: Alex Chaika
    Date:   May 17, 2016    
#>
[CmdletBinding()]
Param(
   [Parameter(Mandatory=$True)]
   [string]$Source,
   [Parameter(Mandatory=$True)]
   [String]$Destination,
   [Parameter(Mandatory=$False)]
   [String]$Log
)
if (!($Log)) {
    $Log = "c:\temp\copylog.txt"
}
$Error.Clear()
$Dt = Get-Date
New-Item -ItemType file -Path $Log -Force
"Starting incremental sync process for: $Source folder at $Dt" | Out-File $Log -Append
xcopy $Source $Destination /c /s /e /r /h /d /y /x
$Dt = Get-Date
"Incremental sync process for: $Source folder has completed at $Dt. The following errors occurred:" | Out-File $Log -Append
" " | Out-File $Log -Append
"$Error" | Out-File $Log -Append

First, since I wasn't going to use the script myself, I didn't hardcode parameter values in the script, and I had to give the end user some idea of how to use it. Thus, I wrote a help section with a description of what the script does. The Get-Help cmdlet is quite useful in such cases because it allows the end user to retrieve all the information required to successfully work with the script.

Users will get the output you see in the screenshot below if they run the Get-Help cmdlet against the script.
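For example, this is the command an end user would run (using the script's file name from this post):

PS C:\Scripts> Get-Help .\Copy_Folders_and_subfilders_with_Error_Logging_Incrementaly.ps1 -Full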

Using Get-Help to explain the usage of the PowerShell script

Now, let’s take a look at the parameters section.

[CmdletBinding()]
Param(
   [Parameter(Mandatory=$True)]
   [string]$Source,
   [Parameter(Mandatory=$True)]
   [String]$Destination,
   [Parameter(Mandatory=$False)]
   [String]$Log
)

The first line is the CmdletBinding() attribute, which makes the script operate like a compiled cmdlet. This is followed by two mandatory parameters, $Source (source folder) and $Destination (destination folder), which the script expects. The $Log parameter is not mandatory, because a default value is assigned in the script:

if (!($Log)) {
    $Log = "c:\temp\copylog.txt"
}
$Error.Clear()

The line above is just a precaution. I use the automatic $Error variable to log process errors, and I have to be sure that it doesn't still contain old errors from earlier commands that are not related to the current operation.
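If you want to see how the $Error variable behaves, a quick experiment like this illustrates it (the path is just an example):

Get-Item C:\no-such-file -ErrorAction SilentlyContinue
$Error[0]      # the most recent error record
$Error.Count   # how many errors the session has collected so far
$Error.Clear() # empty the collection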

$Dt = Get-Date

Here, we store the current date in the $Dt variable.

New-Item -ItemType file -Path $Log -Force

This command creates the log file. Each time the script runs, it recreates the log file.
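If you would rather keep a history of previous runs instead of overwriting the log, a small variation (not part of the original script) is to put a timestamp into the file name:

$Log = "c:\temp\copylog_$(Get-Date -Format 'yyyyMMdd_HHmmss').txt"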

"Starting incremental sync process for: $Source folder at $Dt" | Out-File $Log -Append

This line writes the source folder and the date and time into the log file.

xcopy $Source $Destination /c /s /e /r /h /d /y /x

As you can see, I use quite a few xcopy switches; extensive documentation is available if you need the full details.
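In short, this is what each switch contributes (the descriptions follow the official xcopy documentation):

/c  continue copying even if errors occur
/s  copy directories and subdirectories
/e  include empty directories as well
/r  overwrite read-only files
/h  copy hidden and system files
/d  copy only files whose source time is newer than the destination time; this is what makes the sync incremental
/y  suppress prompts to confirm overwriting existing files
/x  copy file audit settings (implies /o, which copies ownership and ACL information)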

$Dt = Get-Date

In the next line, I refresh the date and time information.

"Incremental sync process for: $Source folder has completed at $Dt. The following errors occurred:" | Out-File $Log -Append
" " | Out-File $Log -Append

This command updates the log file with the time of completion.

"$Error" | Out-File $Log -Append

Here, I write the contents of the $Error variable into the log file. If any errors occurred during the file copy process, all of them will now be in the log file.
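As a possible refinement (this is not in the script above), you could write each error on its own line and note explicitly when the run was clean:

if ($Error.Count -gt 0) {
    $Error | ForEach-Object { $_.ToString() } | Out-File $Log -Append
}
else {
    "No errors occurred." | Out-File $Log -Append
}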

13 Comments
  1. Albert 6 years ago

    Hi Alex,

    Many thanks for sharing the tips.

    Is it possible to send an email notification once the copy has finished, along with the log result?

  2. Radu 6 years ago

    robocopy does a better job.

    • Author
      Alex Chaika 6 years ago

      Hi Radu,
      robocopy certainly does its job. What robocopy doesn't have and PowerShell does is flexibility. The last comment from Albert demonstrates that very well: you need an email after the copy is done? Here you go! This is the beauty of scripting languages.

  3. Author
    Alex Chaika 6 years ago

    Hi Albert,

    You're welcome!

    Yes it is possible. Just use the Send-MailMessage PowerShell cmdlet. It is very simple:

    Send-MailMessage -From "user@yourdomain.com" -To "admin@yourdomain.com" -Subject "Copy is completed" -Body "Please find error log enclosed" -Attachments $Log -SmtpServer "yoursmtpserver.local" -DeliveryNotificationOption OnFailure

  4. Caleb 6 years ago

    Why not use robocopy with PowerShell? Xcopy is antiquated and doesn't have multithreading abilities. The one thing robocopy needs is the ability to log only errors, as you mentioned.

    • Author
      Alex Chaika 6 years ago

      Hi Caleb,

      You certainly can use robocopy with PowerShell; this is just a question of preference. And as you can see, I'm not writing scripts to describe them in the blog; I'm rather just sharing my experience with some tools. If it works for some people, good. If someone wants to improve it, the last thing I would do is object.

  5. Caleb 6 years ago

    Robocopy has the switches /w:0 /r:0, which eliminate any wait or retry time, addressing one reason you mentioned, and xcopy has the same logging problem you mentioned robocopy has. Why did you prefer xcopy over robocopy when it has many more options and is more powerful? I would actually love to have a PowerShell/robocopy script that logged only errors but don't have the PowerShell skills to do it yet.

    • Author
      Alex Chaika 6 years ago

      Unfortunately, even with the mentioned switches, robocopy still tried to copy the failed file again. So the reason I preferred xcopy is simple: it worked.

      You can just take my script, replace xcopy with robocopy and your preferred switches, and play with it. It is a pretty simple script, so it could be good practice if you want to get into PowerShell at some point.
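      For example, the xcopy line in the script could become something like this (the switches are just a starting point, not a tested recommendation):

      robocopy $Source $Destination /e /copyall /r:0 /w:0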

  6. Gyz 6 years ago
    • Author
      Alex Chaika 6 years ago

      Well, I'm glad somebody did a good job creating a PowerShell cmdlet with robocopy functions. The problem with third-party cmdlets is that you need to add them manually. So in my case, since I was passing my script to another team, I would first need to add the cmdlet on all the machines where they were going to run it. Another problem with this particular cmdlet is that it is an alpha version. Issues with unstable software are the last thing I'd like to deal with in production.

  7. rude naggar 6 years ago

    Robocopy replaces XCopy in newer versions of Windows:

    Uses Mirroring, XCopy does not
    Has a /RH option to allow a set time for the copy to run
    Has a /MON:n option to check differences in files
    Copies over more file attributes than XCopy

    The most important difference is that robocopy will (usually) retry when an error occurs, while xcopy will not. In most cases, that makes robocopy far more suitable for use in a script.

    If you have XP or Windows Server you can easily get this in the Resource Kits. If you have Vista, it’s already in your path. That’s always nice. It’s Robust, indeed (hence, Robocopy) but it’s legendarily unforgiving. If anything is wrong with the command line options you’ll just get the help. It’s so hard to use there’s even a GUI Frontend you can get. However, when I want to get a directory from here to over there, I just do this (no wildcards allowed! Doh!) and it just gets there, auto skipping files that are already at the destination. It’s also wonderful over an unreliable network:

    robocopy "H:\Source" "z:\Dest" /S /Z

    Where /s means subdirectories, and /z means in restartable mode.

    Nag

  8. Sam 4 years ago

    I suggest using the Start-BitsTransfer cmdlet when you do file copies; it creates a transfer job that I can resume if a restart of the server happens. It's a more efficient way to do copying.

    $SourcePath = 'U:\BACKUPS\FULL\'
    $DestPath = '\\Servername\FULL_BKUP$\'
    # Grab the 12 most recent files
    $LastBKUP = Get-ChildItem -Path $SourcePath | Sort-Object LastWriteTime -Descending | Select-Object -First 12
    # Alternative: files changed within the last 15 days
    # $LastBKUP = Get-ChildItem -Path $SourcePath | Sort-Object LastWriteTime -Descending | Where-Object { $_.LastWriteTime -gt (Get-Date).AddDays(-15) }
    foreach ($i in $LastBKUP) {
        $FullDestinationPath = Join-Path $DestPath $i.Name
        $FullSourcePath = Join-Path $SourcePath $i.Name
        Start-BitsTransfer -Source $FullSourcePath -Destination $FullDestinationPath
    }

  9. Lina 1 year ago

    Sounds good and helpful. Alternatively, I use Gs Richcopy 360; it is GUI based and very quick.
