Tag Archives: Powershell

Robocopy Powershell Script for Home Back-up

In this post, I am going to share my backup script for my home data needs. First, I will explain the problem that my script addresses.

When I built my computer a little over a year ago, I decided I was going to do backup properly and purchased a licence for Acronis True Image Home 2012. This product has proved to be quite a disappointment. Acronis does a fine job (I guess) of backing up a system drive, so I still use it to create images of my C drive. This is important because the operating system is running on an SSD, and SSDs are known to drop dead without any advance warning. So having an image of your system drive is imperative if you are running it on an SSD.

Data is a different story. The problem with an application like Acronis is that it doesn’t offer a way of mirroring a data drive. All of its backup types result in a container/image being created with a .tib extension. I want to be able to mirror my data and to surgically pick and choose which directories get backed up.

So I started hunting around and came across this very cool script. In the event that that blog disappears, you can download the original script itself by clicking the following button:

That was my starting point. I finessed the script a little to tailor it more to my needs and to make it more flexible. The script in its original form is hard-coded to one directory. So, if you had a swag of different directory trees which you wanted to back up, you would have to create a separate version of that script for each of those directories (which is what I originally did). My amended version is more generalised and includes some input parameters, so that it can be re-used for various directory trees. I also changed the flags that Robocopy is called with to:

  • suit my goal of creating a mirror image of whatever directory I am backing up; and
  • meet my logging needs.

My script is as follows:

## ================================================================
## Script name: MirrorDirectories.ps1
## ================================================================

## This Script Mirrors a directory tree from source to destination with the Windows builtin command robocopy.
## Exit codes from robocopy are logged to Windows Eventlog.
## Author: NIKLAS JUMLIN

## Usage: Run with administrative rights in Windows Task Scheduler or Administrator:PowerShell
## If not executed with administrative privileges the script will not write to eventlog.

## Amended by David Alan Rogers Esq. tailored for his devious designs!!!

## ================================================================
## Change these parameters as relevant
## ================================================================

Param([string]$dirToBackup, [string]$fullPathOfDirToBackup, [string]$jobName, [string]$nasShare, [string]$logDirectory)


## Name of the job, name of source in Windows Event Log and name of robocopy Logfile.
$JOB = $jobName + ($dirToBackup -replace "\s+", "")

## Source directory
$SOURCE = $fullPathOfDirToBackup

## Destination directory. Files in this directory will mirror the source directory. Extra files will be deleted! 
$DESTINATION = join-path -path $nasShare -childpath $dirToBackup
Write-Host "JOB: $JOB"
Write-Host "SOURCE: $SOURCE"
Write-Host "DESTINATION: $DESTINATION"

## Path to robocopy logfile
$LOGFILE = join-path -path $logDirectory -childpath $JOB
Write-Host "LOGFILE: $LOGFILE"
## Log events from the script to this location
$SCRIPTLOG = $LOGFILE + "-scriptlog.log"
Write-Host "SCRIPTLOG: $SCRIPTLOG"

## Mirror a directory tree. Equivalent to /e /purge
## /e		: Copies subdirectories. Note that this option includes empty directories. 
## /purge	: Deletes destination files and directories that no longer exist in the source. 
$WHAT = @("/MIR")

## /R:3		: Retry open files 3 times
## /W:5 	: wait 5 seconds between tries.
## /FFT 	: assume FAT file times (2 second granularity). Target folder is ext2/ext3, & those file systems also implement file times with 2 second granularity. NTFS does not assume that - http://www.luisrocha.net/2008/12/robocopy-error-error-5-0x00000005.html
## /Z		: ensures Robocopy can resume the transfer of a large file in mid-file instead of restarting.
## /XA:H	: makes Robocopy ignore hidden files, usually these will be system files that we're not interested in.
## /COPY:DT : turn off the attribute copying. /COPY:DAT copies file attributes and is default. Remove the A to prevent attributes being copied - http://www.luisrocha.net/2008/12/robocopy-error-error-5-0x00000005.html
## /NP      : no progress - don’t display % copied.
$OPTIONS = @("/R:3","/W:5","/FFT","/Z","/XA:H","/COPY:DT","/NP") 

## This will create a timestamp like yyyy-mm-dd
$TIMESTAMP = get-date -uformat "%Y-%m-%d"

## This will get the time like HH:MM:SS
$TIME = get-date -uformat "%T"

## Append to robocopy logfile with timestamp
$ROBOCOPYLOG = "/LOG+:$LOGFILE`-Robocopy`-$TIMESTAMP.log"

## Wrap all above arguments
$cmdArgs = @("$SOURCE","$DESTINATION",$WHAT,$ROBOCOPYLOG,$OPTIONS)

## ================================================================

## Start the robocopy with above parameters and log errors in Windows Eventlog.
& C:\Windows\SysWOW64\Robocopy.exe @cmdArgs

## Get LastExitCode and store in variable
$ExitCode = $LastExitCode

Write-Host "ExitCode: $ExitCode"

$MSGType=@{
"16"="Errror"
"8"="Error"
"4"="Warning"
"2"="Information"
"1"="Information"
"0"="Information"
}

## Message descriptions for each ExitCode.
$MSG=@{
"16"="Serious error. robocopy did not copy any files.`n
Examine the output log: $LOGFILE`-Robocopy`-$TIMESTAMP.log"
"8"="Some files or directories could not be copied (copy errors occurred and the retry limit was exceeded).`n
Check these errors further: $LOGFILE`-Robocopy`-$TIMESTAMP.log"
"4"="Some Mismatched files or directories were detected.`n
Examine the output log: $LOGFILE`-Robocopy`-$TIMESTAMP.log.`
Housekeeping is probably necessary."
"2"="Some Extra files or directories were detected and removed in $DESTINATION.`n
Check the output log for details: $LOGFILE`-Robocopy`-$TIMESTAMP.log"
"1"="New files from $SOURCE copied to $DESTINATION.`n
Check the output log for details: $LOGFILE`-Robocopy`-$TIMESTAMP.log"
"0"="$SOURCE and $DESTINATION in sync. No files copied.`n
Check the output log for details: $LOGFILE`-Robocopy`-$TIMESTAMP.log"
}

## Function to see if running with administrator privileges
function Test-Administrator  
{  
    $user = [Security.Principal.WindowsIdentity]::GetCurrent();
    (New-Object Security.Principal.WindowsPrincipal $user).IsInRole([Security.Principal.WindowsBuiltinRole]::Administrator)  
}

## If running with administrator privileges
If (Test-Administrator) {
	"Has administrator privileges"
	
	## Create EventLog Source if not already exists
	if ([System.Diagnostics.EventLog]::SourceExists("$JOB") -eq $false) {
	"Creating EventLog Source `"$JOB`""
    [System.Diagnostics.EventLog]::CreateEventSource("$JOB", "Application")
	}
	
	## Write known ExitCodes to EventLog
	if ($MSG."$ExitCode" -gt $null) {
		Write-EventLog -LogName Application -Source $JOB -EventID $ExitCode -EntryType $MSGType."$ExitCode" -Message $MSG."$ExitCode"
	}
	## Write unknown ExitCodes to EventLog
	else {
		Write-EventLog -LogName Application -Source $JOB -EventID $ExitCode -EntryType Warning -Message "Unknown ExitCode. EventID equals ExitCode"
	}
}
## If not running with administrator privileges
else {
	## Write to screen and logfile
	Add-content $SCRIPTLOG "$TIMESTAMP $TIME No administrator privileges" -PassThru
	Add-content $SCRIPTLOG "$TIMESTAMP $TIME Cannot write to EventLog" -PassThru
	
	## Write known ExitCodes to screen and logfile
	if ($MSG."$ExitCode" -gt $null) {
		Add-content $SCRIPTLOG "$TIMESTAMP $TIME Printing message to logfile:" -PassThru
		Add-content $SCRIPTLOG ($TIMESTAMP + ' ' + $TIME + ' ' + $MSG."$ExitCode") -PassThru
		Add-content $SCRIPTLOG "$TIMESTAMP $TIME ExitCode`=$ExitCode" -PassThru
	}
	## Write unknown ExitCodes to screen and logfile
	else {
		Add-content $SCRIPTLOG "$TIMESTAMP $TIME ExitCode`=$ExitCode (UNKNOWN)" -PassThru
	}
	Add-content $SCRIPTLOG ""
	Return
}
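
For reference, MirrorDirectories.ps1 can also be invoked directly from an elevated PowerShell prompt by supplying the parameters by name. A minimal example, using the same values that appear in my calling script below:

.\MirrorDirectories.ps1 -dirToBackup 'Documents' -fullPathOfDirToBackup 'E:\Documents' -jobName 'EDriveBakJob-' -nasShare '\\BACKUPNAS\plaguisebak' -logDirectory 'H:\TestThing'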

In order to use that script, I have created another script that feeds it the required parameters. That script is as follows:

## Common variables for the backup operations
$InvokedFrom = (Split-Path $MyInvocation.InvocationName)
$MirrorScriptPath = join-path -path $InvokedFrom -childpath MirrorDirectories.ps1
$JobName = "EDriveBakJob-"
$NasDirectoryForBaks = "\\BACKUPNAS\plaguisebak"
$logDirectory = "H:\TestThing"

## Begin to massage variables into the final string which will be executed
$constantVariables = [string]::Format(" -jobName '{0}' -nasShare '{1}' -logDirectory '{2}'", $JobName, $NasDirectoryForBaks, $logDirectory)
$scriptPlusFolderSpecificVariables = $MirrorScriptPath + " -dirToBackup '{0}' -fullPathOfDirToBackup '{1}'" + $constantVariables

## A function to provide the completely finished string to be executed
Function Get-Full-Line-To-Execute([string]$DirName, [string]$DirFullPath) {
    $returnString = [string]::Format($scriptPlusFolderSpecificVariables, $DirName, $DirFullPath)    
    return $returnString
}

## ******************************************** Backup Operations ********************************************
## E:\Documents
$DirectoryName = "Documents"
$DirectoryFullPath = "E:\Documents"
$ExePlusArgsDocuments = Get-Full-Line-To-Execute $DirectoryName $DirectoryFullPath
write-host $ExePlusArgsDocuments "`r`n"
invoke-expression -Command $ExePlusArgsDocuments

write-host "$DirectoryName directory done!`r`n"

## E:\Jeremia
$DirectoryName = "Jeremia"
$DirectoryFullPath = "E:\Jeremia"
$ExePlusArgsJeremia = Get-Full-Line-To-Execute $DirectoryName $DirectoryFullPath
write-host $ExePlusArgsJeremia "`r`n"
invoke-expression -Command $ExePlusArgsJeremia

write-host "$DirectoryName directory done!`r`n"

## E:\Mozilla
$DirectoryName = "Mozilla"
$DirectoryFullPath = "E:\Mozilla"
$ExePlusArgsMozilla = Get-Full-Line-To-Execute $DirectoryName $DirectoryFullPath
write-host $ExePlusArgsMozilla "`r`n"
invoke-expression -Command $ExePlusArgsMozilla

write-host "$DirectoryName directory done!`r`n"

As you can see, all I have to do to add a directory to the backup operation is to create another section under the area delineated by the Backup Operations comment.
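
For example, to add a hypothetical E:\Code directory (the same example I use when explaining the paths below), the new section would just repeat the pattern:

## E:\Code
$DirectoryName = "Code"
$DirectoryFullPath = "E:\Code"
$ExePlusArgsCode = Get-Full-Line-To-Execute $DirectoryName $DirectoryFullPath
write-host $ExePlusArgsCode "`r`n"
invoke-expression -Command $ExePlusArgsCode

write-host "$DirectoryName directory done!`r`n"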

A few comments about that calling script:

  1. I keep this script in the same directory as the MirrorDirectories.ps1 script. This can be changed, but you’ll have to set the $MirrorScriptPath variable to the full path of its location
  2. $JobName is set to whatever tickles your fancy
  3. $NasDirectoryForBaks is set to the overarching backup directory which will contain all of the directories which I back up
  4. $logDirectory will contain the logs which Robocopy writes out

To explain the paths a little more, the overarching directory will be something like \\BACKUPNAS\plaguisebak (in my environment that is a share on a QNAP NAS). Then, in each operation a target folder is specified, such that the full destination path is the path to the overarching directory plus the target folder. For example, if I were backing up E:\Code, $NasDirectoryForBaks would be set to \\BACKUPNAS\plaguisebak and $DirectoryFullPath (lower in the script) would be set to E:\Code, with the $DirectoryName variable set to Code. This results in E:\Code being mirrored to \\BACKUPNAS\plaguisebak\Code. It is important to do this, because if you set the target of each backup operation to \\BACKUPNAS\plaguisebak without any subfolder, each backup operation will delete and overwrite whatever is in \\BACKUPNAS\plaguisebak.
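
If you want to sanity-check how the destination path will be composed before running a backup, you can evaluate the same Join-Path call that MirrorDirectories.ps1 uses. The values here are just the E:\Code example from the paragraph above:

$NasDirectoryForBaks = "\\BACKUPNAS\plaguisebak"
$DirectoryName = "Code"
## Prints \\BACKUPNAS\plaguisebak\Code, the folder that E:\Code will be mirrored into
Join-Path -Path $NasDirectoryForBaks -ChildPath $DirectoryName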

As Niklas notes in the blog post in which he explains his script, the interesting aspect of it is the fact that it writes messages to the Windows Event Log. If something goes wrong, I can look there and see what error code Robocopy exited with. Here, we can see that the Robocopy operation exited with a code of 1 and the path to the log file is displayed:
[Screenshot: Backup Succeeded]

In this case, there was a problem (the SQL Server service was still running), and it exited in an error state with a code of 8:
[Screenshot: Backup Failed]
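
Because the events are written to the Application log with the job name as the source, you can also pull them up from PowerShell rather than opening Event Viewer. A quick sketch, assuming the job name works out to EDriveBakJob-Documents (i.e. $JobName plus the directory name with any whitespace stripped):

## List the five most recent backup events for one job
Get-EventLog -LogName Application -Source "EDriveBakJob-Documents" -Newest 5 |
    Format-Table TimeGenerated, EntryType, EventID, Message -AutoSize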

Warning

A quick warning about my script: when I say mirror, I mean mirror. So, if you delete a file/directory from the source, the next time you run the script it will be removed from the destination (the backup location on my NAS). If you want to retain a copy of something for long-term backup but remove it from your day-to-day system, you need to copy it from either location to a third backup location first. This is not a common occurrence for me, but it does mean that before I delete something from my machine, I have a think about whether I want to store it elsewhere for long-term persistence.

Get my scripts:

Find Packages on Nuget Using the Package Manager Console

Another handy NuGet command for the Package Manager Console:
PM> Get-Package -Filter React -ListAvailable

Updating jQuery Using Nuget to a Version Less than 2.0

jQuery 2 is out and, whilst I’m keen to embrace new things, a lot of third-party libraries have jQuery 1.* as a dependency. If you just update jQuery using NuGet, it brings down jQuery 2.*, which may not be what you want.

So, the easiest way to target a specific version is to use the following command in the Package Manager Console (I chose 1.10.1 as an example):
PM> Install-Package jQuery -Version 1.10.1

Find and Replace with Powershell

I recently had to perform a mundane task. Basically, I needed to trawl through about 50 SQL text files and duplicate them, replacing the string dev-server with sit-server. And then I had to do it all over again with prod-server.
I knew I could easily do this using the Linux find and sed commands together. But I’m starting to think in PowerShell these days and was curious to see what it had to offer for the task at hand.

Some quick Googling brought me to this post on the very topic. I opted for the script version of the command (as distinct from the shell version), which was:

$a = (gci | ? {$_.Attributes -ne "Directory"}); 
$a | % { cp $_ "$($_).bak";
(gc $_) -replace "dev-server","sit-server" | sc -path $_ }

In the first line, the script assigns to the variable $a all of the files in the directory (by excluding directories). gci is an alias for Get-ChildItem.
The second line creates a backup of each SQL file before the operation is performed. A good, organised, cautious approach: rollback is a bugger when you can’t do it.
The third line performs the actual find and replace operation in each of the text files in the directory.
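
Since I had to repeat the exercise for prod-server, the same pipeline can be wrapped in a small function so the server names become parameters. This is only a sketch of how I would generalise it; the function name and parameters are my own invention:

## Hypothetical helper: back up each file in the current directory, then
## replace one server name with another in its contents (in place).
function Replace-ServerName([string]$From, [string]$To) {
    Get-ChildItem | Where-Object { $_.Attributes -ne "Directory" } | ForEach-Object {
        Copy-Item $_ "$($_).bak"
        (Get-Content $_) -replace $From, $To | Set-Content -Path $_
    }
}

Replace-ServerName "dev-server" "sit-server"
## ...and again, on a fresh copy of the original files, for the prod pass:
## Replace-ServerName "dev-server" "prod-server"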

That tiny script saved me about 30 minutes. And a very boring 30 minutes it would have been!