Feb 22 2011

If you’ve been following my blog, you’ll know that I’ve written quite a few posts about the wonders of cloud backups and why you should be doing them to protect your precious data and memories. This doesn’t just apply to personal computer use – businesses should be taking advantage of offsite backups as well. Large businesses typically have this under control, but I’ve found that small businesses (especially those without an IT staff) tend to be the most vulnerable. Owners of many small businesses know just enough about technology to do what they need, and rarely have the time or desire to keep up with changing times and best practices. While I don’t hold this against anyone, it is rather unfortunate.

A few months ago, a large bridal shop near me burned to the ground. This shop was of particular significance because Michelle ordered her wedding dress from there. We were very fortunate that her dress had not yet been made, so it wasn’t at the shop. Many others weren’t nearly as lucky. The business was a total loss, and with it went thousands of dresses, many of which were being held for upcoming weddings. The fire was on a Wednesday and many brides were scheduled to pick up their dresses that night for weddings taking place that weekend. Some literally arrived to pick up their dress and saw the building in flames. Through the generosity of other bridal shops in the area, I believe everyone was able to find *a* dress for their wedding, though obviously not the one they had ordered.

So what does this have to do with cloud backups? Nothing. The cloud can do many things, but putting out fires and replacing burned wedding dresses is not among them. Since there was plenty of time until our wedding, Michelle decided to wait a few weeks for things to die down and for the business to set up a temporary space before calling to see what the deal was with her dress. We were assured that the dress order (placed several months earlier) was at the factory and we had nothing to worry about regarding its delivery in time for our wedding. That being said, they also informed us that they had lost absolutely all of their data, and asked that we please fax or email over any paperwork we had, including order forms and receipts.

This was not an issue for us at all, as we’ve been keeping all of our wedding-related info (including scanned forms) in Google Docs and sharing it between our accounts. This has been incredibly helpful, and I’ll have to blog about it all sometime in the future. We were able to email them everything we had within the hour. A little while later I got to thinking about the tremendous loss this business just incurred. Not only did they lose their inventory and their building, but all their customer and order data as well. Anyone whose name was on a list to get called back about something probably never heard from them. We were honest and sent back the forms reflecting how much money we still owed for the dress, but if they truly lost everything (and we didn’t have a conscience) we could have very easily doctored that receipt to say that everything was paid in full.

Once again, the cloud couldn’t have prevented the fire, and I feel terrible for all the brides who were thrown a curveball at the last minute and couldn’t get married in the dress of their dreams. But to me it’s equally sad that all the data loss that followed could have been prevented for a few dollars a month. To all the small business owners out there: your business is your data – treat your data like your job depends on it!

Feb 08 2011

This month’s T-SQL Tuesday is being hosted by Pat Wright (blog | twitter), and Pat is asking about tips and tricks for making life easier through automation using either T-SQL or PowerShell.

Recently I have been doing more and more work in PowerShell in order to get familiar with it. So far I’ve been quite happy. Batch scripting always seemed rather clumsy to me, and PowerShell represents a much-needed improvement, especially for someone like me coming from a .NET programming background.

One of the processes I’ve found particularly helpful to automate in PowerShell is the testing of database backups. A backup is useless if you can’t restore from it, so it’s an excellent idea to make sure you can successfully restore from your backups on a regular basis. Larger databases can take quite a while to restore, and doing it manually can get to be a real pain. Automating the process means it can happen at any time without human intervention, and all I need to do is check my email to determine if the restore succeeded or failed.

How It Works

While the code behind all this (which I’ll get to in a minute) may appear complex, what’s actually happening is pretty simple. The process is built with a few assumptions:
– You are testing a full backup
– Backups are located on a network share that’s easily accessible

The first step is to locate the appropriate backup on the network share and copy it to a temporary folder on the local machine. From there, the database is restored to the local machine with the name specified in the config file. The locations where the data and transaction log files are restored can also be configured. Finally, a DBCC CHECKDB is optionally run once the database is restored, and an email is sent notifying the operator(s) of the outcome, including a log of how the job ran.

The Files

This automated process has two parts: the PowerShell script and an XML config file. I’m a big fan of config files, as they allow me to use the same script on many different systems with different settings. I find this to be much better than the alternative of having lots of slightly different versions of the same script floating around.

The configuration file contains one or more “jobs”, each of which is geared toward restoring a specific database. With all of the configuration settings for a specific database packaged in a job, the number of arguments needed to execute the script is dramatically reduced. There is also a list of email addresses to be alerted upon the success/failure of the script. The configuration file I have included for download is heavily commented to explain the meaning of each setting. I always take the time to write XSDs for my configuration files in hopes of minimizing confusion about their structure – this is included as well.
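
The heavily commented config file and its XSD are in the download, but to give a feel for the structure, a job entry might look something like the sketch below. The element and attribute names match what the script reads; the root element name and all of the values here are purely placeholders.

<?xml version="1.0" encoding="utf-8"?>
<AutoRestoreConfig>
  <!-- comma-separated list of addresses to receive the success/failure email -->
  <EmailAlertList>dba@example.com</EmailAlertList>
  <Job name="RestoreTest_MyDB">
    <!-- network share where the full backups are stored -->
    <BackupStorageFolder>\\BackupServer\SQLBackups\MyDB</BackupStorageFolder>
    <!-- backup files are expected to be named Prefix_Full_yyyyMMdd*.bak -->
    <BackupFile Prefix="MyDB" />
    <!-- local folder the backup file gets copied to before restoring -->
    <LocalTempFolder>D:\RestoreTemp</LocalTempFolder>
    <!-- name the database will be restored as on the local instance -->
    <DBToRestoreTo>MyDB_RestoreTest</DBToRestoreTo>
    <FolderToRestoreDataTo>D:\SQLData</FolderToRestoreDataTo>
    <FolderToRestoreLogTo>E:\SQLLogs</FolderToRestoreLogTo>
    <!-- 1 = run DBCC CHECKDB after the restore, 0 = skip it -->
    <RunConsistencyCheck>1</RunConsistencyCheck>
  </Job>
</AutoRestoreConfig>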

The PowerShell script itself takes 2 arguments: the name of the job, and the location of the configuration file containing that job. From there, the script reads the configuration file and carries out all the appropriate actions. It requires two modifications before running on your system – the location of the SMTP server that will be sending out notification emails, and an email address it will list as being the sender of said emails.
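
For example, running a test restore from a PowerShell prompt (or from a SQL Agent PowerShell job step) might look like the following. The job name and config file path are placeholders; only the script name comes from the usage message in the script itself.

# run the restore test for a single job defined in the config file (example names/paths only)
.\AutoRestore.ps1 "RestoreTest_MyDB" "C:\Scripts\AutoRestoreConfig.xml"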

As with all the scripts you’ll find on my site, these files are provided as-is and with no warranty. You use them at your own risk, and it is highly recommended that you read and understand what they do before running them. Never deploy them to a production system without extensive testing in a development environment.

You can download all the files here. I’ve also posted the PowerShell script below for your viewing pleasure.

Good luck and happy restores!

$ErrorActionPreference = 'Stop';
$global:LOG = "";

# this just helps for printing errors to console
Function PrintUpdate($str) {
	$s = (Get-Date -format HH:mm:ss) + " $str";
	if($host.Name -eq "ConsoleHost") {
		Write-Host $s;
	} else { # no host is present in SQL Agent
		Write-Output $s;
	}
	$global:LOG = $global:LOG + "$s `r`n";
}

$USAGE = "USAGE: AutoRestore.ps1 <JobName> <ConfigFileName>";

#Parameters
# #1 - Job Name
# #2 - XML File

try {

	#check for required args
	if($args.Length -ne 2) {
		throw $USAGE;
	}

	# get job name
	$jobName = $args[0];

	# load config file for the job
	$configXML = [xml](gc $args[1]);
	$jobXpath = "Job[@name='" + $jobName + "']";
	$jobConfig = $configXML.DocumentElement.SelectSingleNode($jobXpath);
	$emailList = $configXML.DocumentElement.SelectSingleNode("EmailAlertList").InnerText;

	if($jobConfig -eq $null) {
		throw "Cannot locate job '$jobName'";
	}

	PrintUpdate("Starting Job '$jobName'");

	# load variables
	$folderContainingBackups = $jobConfig.SelectSingleNode("BackupStorageFolder").InnerText;
	$backupFileNamePrefix = $jobConfig.SelectSingleNode("BackupFile/@Prefix").Value;
	$localTempFolder = $jobConfig.SelectSingleNode("LocalTempFolder").InnerText;
	$dbToRestoreTo = $jobConfig.SelectSingleNode("DBToRestoreTo").InnerText;
	$folderToRestoreDataTo = $jobConfig.SelectSingleNode("FolderToRestoreDataTo").InnerText;
	$folderToRestoreLogTo = $jobConfig.SelectSingleNode("FolderToRestoreLogTo").InnerText;
	$runConsistencyCheck = $jobConfig.SelectSingleNode("RunConsistencyCheck").InnerText;

	# validate variables
	if(!(Test-Path $folderContainingBackups)) {
		throw "Folder Containing Backups ""$folderContainingBackups"" does not exist!";
	}

	if(!(Test-Path $localTempFolder)) {
		throw "Local Temp Folder ""$localTempFolder"" does not exist!";
	}

	if(!(Test-Path $folderToRestoreDataTo)) {
		throw "Folder To Restore Data To ""$folderToRestoreDataTo"" does not exist!";
	}

	if(!(Test-Path $folderToRestoreLogTo)) {
		throw "Folder To Restore Log To ""$folderToRestoreLogTo"" does not exist!";
	}

	if(!($runConsistencyCheck -match "^0$|^1$")) {
		throw "Invalid RunConsistencyCheck Value";
	}

	# find today's backup file (assumes it was done earlier today)
	$backupFileFullName = dir $folderContainingBackups -name -filter ($backupFileNamePrefix + "_Full_" + (Get-Date -format yyyyMMdd) + "*.bak");

	if($backupFileFullName -eq $null) {
		throw "Backup file not found!";
	}
	PrintUpdate("Found $backupFileFullName");

	# copy backup file to local folder
	PrintUpdate("Copying $backupFileFullName to $localTempFolder");
	copy-item ($folderContainingBackups + "\" + $backupFileFullName) $localTempFolder;

	# drop destination db if it exists
	[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO") | Out-Null;
	[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SmoExtended") | Out-Null;

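	# connect to the local SQL Server instance; the 12000-second statement timeout keeps long restores and checks from timing out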
	$server = New-Object Microsoft.SqlServer.Management.Smo.Server("(local)");
	$server.ConnectionContext.StatementTimeout = 12000;
	$db = $server.Databases[$dbToRestoreTo];

	if($db -ne $null) {
		PrintUpdate("Dropping DB '$dbToRestoreTo'");
		$server.KillDatabase($dbToRestoreTo); # kills all connections before dropping
	}

	# restore db from backup
	PrintUpdate("Restoring backup To '$dbToRestoreTo'");
	$restoreFileName = ($localTempFolder + "\" + $backupFileFullName);
	$restoreDevice = new-object Microsoft.SqlServer.Management.smo.BackupDeviceItem $restoreFileName, File;
	$smoRestore = new-object Microsoft.SqlServer.Management.Smo.Restore;
	$smoRestore.Devices.Add($restoreDevice);

	# need to set relocatefiles since machine we're restoring to may not have same drive configuration as production
	$dt = $smoRestore.ReadFileList($server);

	if($folderToRestoreDataTo -eq $null) {
		$dataFileLoc = $server.DefaultFile;
		if($dataFileLoc -eq $null -or $dataFileLoc -eq "") {
			$dataFileLoc = $server.MasterDBPath;
		}
	} else {
		$dataFileLoc = $folderToRestoreDataTo;
	}

	if($folderToRestoreLogTo -eq $null) {
		$logFileLoc = $server.DefaultLog;
		if($logFileLoc -eq $null -or $logFileLoc -eq "") {
			$logFileLoc = $server.MasterDBLogPath;
		}
	} else {
		$logFileLoc = $folderToRestoreLogTo;
	}

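	# relocate each file from the backup into the configured folders, prefixing the physical file name with the destination database name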
	foreach($r in $dt.Rows) {
		$relocateFile = new-object Microsoft.sqlserver.management.smo.relocatefile;
		$relocateFile.LogicalFileName = $r["LogicalName"];
		$oldFilePath = $r["PhysicalName"];
		$n = $oldFilePath.lastindexof("\") + 1;
		if($r["Type"] -eq "D") { #data files
			$relocateFile.PhysicalFileName = $dataFileLoc + "\" + $dbToRestoreTo + "_" + $oldFilePath.substring($n);
		} else { # log file
			$relocateFile.PhysicalFileName = $logFileLoc + "\" + $dbToRestoreTo + "_" + $oldFilePath.substring($n);
		}
		[void]$smoRestore.RelocateFiles.Add($relocateFile);
	}

	# finally execute the restore function
	$smoRestore.Database = $dbToRestoreTo;
	$smoRestore.SqlRestore($server);

	# run DBCC CHECKDB
	if($runConsistencyCheck -eq "1") {
		PrintUpdate("Running consistency check on '$dbToRestoreTo'");
		$db = $server.Databases[$dbToRestoreTo];
		$db.CheckTables("None");
	}

	PrintUpdate("Job Complete");
	$emailMsg = "completed successfully";

} catch {
	$emailMsg = "encountered errors";
	PrintUpdate("`r`n***ERROR***  " + $_.Exception.ToString());
} finally {
	if($emailList -ne "" -AND $emailList -ne $null) {
		$emailFrom = "sender_email_address";
		$subject = "Job '$jobName' $emailMsg";
		$body = $global:LOG;
		$smtpServer = "smtp_server_name";
		$smtp = new-object Net.Mail.SmtpClient($smtpServer);
		$smtp.Send($emailFrom, $emailList, $subject, $body);
	}
}

Feb 03 2011

A while back I did a few posts covering my favorite cloud backup solutions, and one of my favorites was Mozy. That very well may change now that Mozy has announced they are changing their pricing structure. They claim that “the backup market has changed” since 2006 due to people taking more photos and videos than ever before, and even though the majority of their users back up 50GB or less, the few that greatly exceed that number are ruining it for everyone. Gone are Mozy’s days of backing up unlimited data for a flat rate.

Instead of $4.95 per month per computer for unlimited data backup, Mozy is now charging $5.99 per month for 50GB of backup space for 1 computer, or $9.99 per month for 125GB of space shared between up to 3 computers. Need more space? You can add to the $9.99 plan in increments of 20GB for $2 per month, and additional computers can be added for that same monthly rate. As before, there are discounts if you pre-pay for 1 or 2 years.

I wasn’t thrilled about Mozy giving its user base a "one-two punch" of raising prices and reducing value, and judging by some of the comments over at the Mozy Community Discussion Boards, it looks like I’m not alone. Shouldn’t storage only be getting cheaper with time? I understand that enterprise-class storage isn’t exactly as simple as picking up a bunch of hard drives from your local geek store, but disk space in general is a lot cheaper than it used to be.

In defense of Mozy, they are still cheaper than using cloud storage such as Amazon S3. Mozy gives users 125GB for $9.99 a month; at S3’s rates today you’d pay $0.14 per GB per month, or $17.50, and that’s just for the storage (don’t forget they also charge for transferring the data to them). On the other hand, there are many competitors who still offer unlimited cloud backup for a flat rate, and I’d imagine they are seeing a lot of new business from disgruntled Mozy users. Some of them, such as CrashPlan and Backblaze, are even offering a discount for those who switch from Mozy.
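
To put rough numbers on that comparison (using the rates quoted above and ignoring S3’s transfer charges), here’s a quick back-of-the-envelope check:

# monthly cost of storing 125GB (rates as quoted above; S3 transfer fees not included)
$mozyMonthly = 9.99;          # Mozy's 125GB / 3-computer plan
$s3Monthly   = 125 * 0.14;    # S3 storage at $0.14 per GB per month = $17.50
'Mozy: ${0:N2}/month   S3: ${1:N2}/month' -f $mozyMonthly, $s3Monthly;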

Another option for cloud backup that has the potential to be awesome is Google Paid Storage. Their prices are amazing – currently $0.25 per GB per year, and they’ll sell you up to 16TB of space! The downside is that there’s no easy way to back up to this space (sometimes I wonder if this is on purpose). You can utilize their space by uploading files to Google Docs, but that’s neither fast nor convenient. People have been hoping for a Google online storage service (usually referred to as "G-Drive") for a while now and there’s still no sign of it coming, but online backups would be another great way for them to make a killing.

Even with this recent price-jacking, are online backups still worth it? Absolutely. If your computer is stolen or goes up in smoke, you will have no trouble replacing the hardware, installed applications, or music, but what you really need to protect are the things that can’t be replaced, such as photos and videos. Things like that are priceless, and the best way to protect them is to keep a copy somewhere far, far away, like the cloud. Backing up to an external hard drive is fine, but should your home be burglarized or destroyed by fire, that external drive is very likely to be just as missing or destroyed as your computer. The peace of mind that can be had from cloud backups should far outweigh the cost. I think of it as an insurance plan – the premiums paid for online backup may be a pain, but they are far less than the cost of losing priceless data.