Jul 12, 2018

As a former DBA, it should be no surprise that I am a big fan of keeping data safe, and not just corporate data – this extends to my personal data as well. Early on I realized that the cloud was a great way to protect all my photos, videos, and other digital keepsakes that I absolutely could not stand to lose. As cloud offerings matured, products specific to archival storage were introduced that allow for long-term storage at very low price points. The tradeoff for this lower storage cost is that, should you need to retrieve the data, it is not immediately available and you may have to wait a few hours. The ideal use case for these products, however, is one where the data never needs to be retrieved at all – it is simply an additional copy being stored for safekeeping.

Two of the products I use extensively for this purpose are Amazon Glacier and, more recently, Microsoft Azure Blob Storage Archive Tier. As happy as I’ve been with Amazon Glacier since its introduction in 2012, I always hoped Microsoft would offer a similar service. My wish came true in Fall of 2017 when an archive tier of Azure Blob Storage was announced. Rather than branding this capability as a new product, Microsoft decided to present it as a new tier of Azure Blob Storage, alongside the existing hot and cool storage tiers.

A noticeable difference from the hot and cool storage tiers is that the archive tier is only available on a per-blob basis. While a storage account can be configured to place all blobs in either the hot or cool tier by default when they are uploaded, the archive tier is not an option. Once a blob is uploaded, it must be explicitly moved into the archive tier. If you use the Azure Portal to do this, there are several clicks involved per blob. The free Azure Storage Explorer client is no better. While I found several third-party tools that can upload files to the archive tier, none were free. At this point, I decided to write my own method using PowerShell, which I am happy to share below.
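To give a sense of what "explicitly moving" a blob means in code, here is a minimal sketch of flipping an existing blob to the archive tier from PowerShell. It assumes the AzureRM-era Azure.Storage cmdlets, and the account name, key, container, and blob names are all placeholders:

```powershell
# Placeholders: substitute your own account name, key, container, and blob name
$ctx = New-AzureStorageContext -StorageAccountName "teststore700" -StorageAccountKey "<access key>"
$blob = Get-AzureStorageBlob -Container "backups" -Blob "photos2017.zip" -Context $ctx

# Block blobs expose SetStandardBlobTier() for Hot/Cool/Archive
$blob.ICloudBlob.SetStandardBlobTier("Archive")
```

This is the per-blob operation the portal performs behind its several clicks.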

If you’re already an Azure pro, feel free to skip ahead. But if you’re new to Azure or don’t already have a storage account, follow along with me to get one set up.

Creating a Storage Account

First, log in to the Azure Portal. If you don’t already have an Azure account, you’ll need to create one.

Once you’re into Azure, you’ll need a storage account. If you don’t already have one, these are simple to create. In the Azure Portal, select “Storage accounts”.


On the storage account screen, click “Add”.


Next, enter the details for your storage account.

You’ll need to give it a name which is globally unique, so it may take you a few tries. For all new accounts Microsoft recommends the “Resource manager” model. The account kind is “Blob storage”. Choose whichever location you like, but realize that one which is closer to you will probably be faster. Since I’m doing archiving, I opt for standard performance, and set the access tier to Cool by default. I like security, so I require secure transfers. Select your subscription or create one if you need to, and do the same for your resource group. Then click “Create”.


Once created, your storage account will appear in the list. Now we need to create a container within that account. Click on the storage account, and then choose “Containers” from the menu on the left. Then click the “+” at the top to add a new container.



Give your container a name, and select an access level. Since I share my archives with no one, I make my containers Private. Click OK.


Your container will now appear in the list. Clicking on the container will show you the container is empty, and you will see that you can upload files to it right through the portal interface.



Finally, for the PowerShell script to work, you will need a key to access the storage account. Go back to the storage account menu and select “Access keys”. You will see two access keys provided for your account. Copy one of them for pasting into the script below.
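If you would rather not copy the key out of the portal, the same keys can be listed with the AzureRM module. A quick sketch, assuming you have already logged in with Login-AzureRmAccount; the resource group and account names below are placeholders:

```powershell
# Placeholders: substitute your resource group and storage account names
Get-AzureRmStorageAccountKey -ResourceGroupName "myResourceGroup" -Name "teststore700"
```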


The Script

Once you have a storage account set up, PowerShell makes the uploading part simple. This script requires the Azure PowerShell module; installation instructions can be found here.

Don’t forget to copy in the name of your storage account (in the case of this demo I would paste “teststore700”) and the access key!

You can also find this script on my GitHub.

This script has just a few parameters, which are as follows:

  • File – Name of the file or files to be uploaded. These can also be piped in.
  • ContainerName – Name of the container to upload to.
  • DestinationFolder – Name of the (virtual) destination folder inside the container. I like everything to be in a folder, so I kept that in mind when scripting this out.
  • ConcurrentTasks [optional] – How many threads should upload the file. This defaults to 1. If you have lots of bandwidth, feel free to increase it. Maximum value of 20.
  • BlobTier [optional] – Which tier of blob storage this file should be set to. This value defaults to “Cool”. Possible values: “Hot”, “Cool”, “Archive”.
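To show how those parameters fit together, here is a stripped-down sketch of such a script’s core – not the exact code from my GitHub. It assumes the AzureRM-era Azure.Storage cmdlets, and the storage account name and key are placeholders to fill in:

```powershell
param(
    [Parameter(Mandatory, ValueFromPipeline)] [string[]] $File,
    [Parameter(Mandatory)] [string] $ContainerName,
    [Parameter(Mandatory)] [string] $DestinationFolder,
    [ValidateRange(1, 20)] [int] $ConcurrentTasks = 1,
    [ValidateSet("Hot", "Cool", "Archive")] [string] $BlobTier = "Cool"
)

begin {
    # Placeholders: paste in your storage account name and access key
    $ctx = New-AzureStorageContext -StorageAccountName "teststore700" -StorageAccountKey "<access key>"
}

process {
    foreach ($f in $File) {
        # Everything lands inside the (virtual) destination folder
        $blobName = "$DestinationFolder/$(Split-Path $f -Leaf)"
        $blob = Set-AzureStorageBlobContent -File $f -Container $ContainerName -Blob $blobName `
            -Context $ctx -ConcurrentTaskCount $ConcurrentTasks -Force

        # The archive tier can't be chosen at upload time, so set the tier afterward
        $blob.ICloudBlob.SetStandardBlobTier($BlobTier)
    }
}
```

The two-step dance at the end is the key part: upload first, then call SetStandardBlobTier() on the resulting blob.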


And finally, some use examples:
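As an illustration, calls to a script with the parameters above might look like the following. The script name, paths, and container here are made up for the example:

```powershell
# Upload one file into the "photos" folder of the "backups" container (tier defaults to Cool)
.\Upload-Blob.ps1 -File "C:\Archive\photos2017.zip" -ContainerName "backups" -DestinationFolder "photos"

# Pipe in a whole directory, use 10 upload threads, and send everything straight to Archive
Get-ChildItem "C:\Archive" -File | Select-Object -ExpandProperty FullName |
    .\Upload-Blob.ps1 -ContainerName "backups" -DestinationFolder "photos" -ConcurrentTasks 10 -BlobTier "Archive"
```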


Happy archiving!

Feb 22, 2011

If you’ve been following my blog, you’ll know that I’ve written quite a few posts about the wonders of cloud backups and why you should be doing them to protect your precious data and memories. This doesn’t just apply to personal computer use – businesses should be taking advantage of offsite backups as well. Large businesses typically have this under control, but I’ve found that small businesses (especially those without an IT staff) tend to be the most vulnerable. Owners of many small businesses know just enough technology to do what they need, and rarely have the time or desire to keep up with changing times and best practices. While I don’t hold this against anyone, it is rather unfortunate.

A few months ago, a large bridal shop near me burned to the ground. This shop was of particular significance because Michelle ordered her wedding dress from there. We were very fortunate that her dress had not yet been made, so it wasn’t at the shop. Many others weren’t nearly as lucky. The business was a total loss, and with it went thousands of dresses, many of which were being held for upcoming weddings. The fire was on a Wednesday and many brides were scheduled to pick up their dresses that night for weddings taking place that weekend. Some literally arrived to pick up their dress and saw the building in flames. Through the generosity of other bridal shops in the area, I believe everyone was able to find *a* dress for their wedding, though obviously not the one they had ordered.

So what does this have to do with cloud backups? Nothing. The cloud can do many things, but putting out fires and replacing burned wedding dresses is not among them. Since there was plenty of time until our wedding, Michelle decided to wait a few weeks for things to die down and the business to set up a temporary space before calling and seeing what the deal was with her dress. We were assured that the dress order (placed several months earlier) was at the factory and we had nothing to worry about regarding its delivery in time for our wedding. That being said, they also informed us that they had lost absolutely all of their data, and were asking that we please fax or email over any paperwork we had including order forms and receipts.

This was not an issue for us at all, as we’ve been keeping all of our wedding-related info (including scanned forms) in Google Docs and sharing it between our accounts. This has been incredibly helpful, and I’ll have to blog about it all sometime in the future. We were able to email them everything we had within the hour. A little while later I got to thinking about the tremendous loss this business just incurred. Not only did they lose their inventory and their building, but all their customer and order data as well. Anyone whose name was on a list to get called back about something probably never heard from them. We were honest and sent back the forms reflecting how much money we still owed for the dress, but if they truly lost everything (and we didn’t have a conscience) we could have very easily doctored that receipt to say that everything was paid in full.

Once again, the cloud couldn’t have prevented the fire and I feel terrible for all the brides who were thrown a curve ball at the last minute and couldn’t get married in the dress of their dreams, but to me it’s equally sad that all the data loss that followed could have been prevented for a few dollars a month. Disasters like this are exactly what cloud backup can prevent. To all the small business owners out there: Your business is your data – treat your data like your job depends on it!

Feb 03, 2011

A while back I did a few posts covering my favorite cloud backup solutions, and one of my favorites was Mozy. That very well may change now that Mozy has announced they are changing their pricing structure. They claim that “the backup market has changed” since 2006 due to people taking more photos and videos than ever before, and even though the majority of their users back up 50GB or less, the few that greatly exceed that number are ruining it for everyone. Gone are Mozy’s days of backing up unlimited data for a flat rate.

Instead of $4.95 per month per computer for unlimited data backup, Mozy is now charging $5.99 per month for 50GB of backup space for 1 computer, or $9.99 per month for 125GB of space shared between up to 3 computers. Need more space? You can add to the $9.99 plan in increments of 20GB for $2 per month, and additional computers can be added for that same monthly rate. As before, there are discounts if you pre-pay for 1 or 2 years.

I wasn’t thrilled about Mozy giving its user base a “one-two punch” of raising prices and reducing value, and judging by some of the comments over at the Mozy Community Discussion Boards, it looks like I’m not alone. Shouldn’t storage only be getting cheaper with time? I understand that enterprise-class storage isn’t exactly as simple as picking up a bunch of hard drives from your local geek store, but disk space in general is a lot cheaper than it used to be.

In defense of Mozy, they are still cheaper than using raw cloud storage such as Amazon S3. Mozy is giving users 125GB for $9.99 a month. At S3’s current rate of $0.14 per GB per month, that same 125GB would cost $17.50, and that’s just for the storage (don’t forget they also charge for transferring the data to them). On the other hand, there are many competitors who still offer unlimited cloud backup for a flat rate, and I’d imagine they are seeing a lot of new business from disgruntled Mozy users. Some of them, such as CrashPlan and Backblaze, are even offering a discount for those who switch from Mozy.

Another option for cloud backup that has the potential to be awesome is Google Paid Storage. Their prices are amazing – currently $0.25 per GB per year, and they’ll sell you up to 16TB of space! The downside is that there’s no easy way to back up to this space (sometimes I wonder if this is on purpose). You can utilize their space by uploading files to Google Docs, however that’s neither fast nor convenient. People have been hoping for a Google online storage service (usually referred to as “G-Drive”) for a while now and there’s still no sign of it coming, but online backups would be another great way for them to make a killing.

Even with this recent price-jacking, are online backups still worth it? Absolutely. If your computer is stolen or goes up in smoke, you will have no trouble replacing the hardware, installed applications, or music; what you really need to protect are the things that can’t be replaced, such as photos and videos. Things like that are priceless, and the best way to protect them is to keep a copy somewhere far, far away, like the cloud. Backing up to an external hard drive is fine, but should your home be burglarized or destroyed by fire, that external drive is very likely to be just as missing or destroyed as your computer. The peace of mind that can be had from cloud backups should far outweigh the cost. I think of it as an insurance plan – the premiums paid for online backup may be a pain, but they are far less than the cost of losing priceless data.

Jul 29, 2010

This is the fourth post in my series on cloud backup solutions.  Previous posts were:

Today’s topic is Google’s Picasa Web Albums, henceforth known as PWA.  Unlike the other solutions I’ve covered, PWA is not explicitly intended to be a backup solution.  It’s designed as a platform for sharing your photos and videos with others, however it also does an excellent job of storing them in the cloud and letting you retrieve them with ease.  I’ve been using it as a form of backup for about a year now, and after browsing the “help” section I see that Google has included notes on how to use PWA as a backup as well.


The best way to upload photos to PWA is to use Picasa, Google’s desktop photo organizer.  I have tens of thousands of photos in my collection and have tried many different photo organization applications, and Picasa is the best I’ve found so far.  Its facial recognition features are particularly amazing.  I could probably write an entire post about Picasa and its features, but I’ll save that for another time – I’m here to talk about saving photos to the cloud today.

Pricing & Storage

PWA accounts come with 1GB of free storage.  After that, you can pay for additional storage at Google Paid Storage’s paltry rates.  Any additional space you purchase will be shared with other Google products, such as Gmail and Google Documents.

Within PWA, photos/videos are stored in albums.  At the moment, you can have up to 10,000 albums, each storing up to 1,000 photos.  This is regardless of how much storage you purchase.


PWA is designed for public sharing, so you won’t find any kind of encryption included like you will for other backup tools.  PWA does have security settings for who is allowed to view your albums though.  You can adjust these settings at the album level, and choose to share with:

  • everybody
  • nobody
  • individuals with authentication (they must have a Google login)
  • the public through a link (anyone who has the URL can view the album)


With PWA, it’s not really a “backup”, more like an “upload”.   There are a whole bunch of ways to get your photos into PWA, including the web interface, email, and mobile device apps.  Most of these methods have limitations on how much can be uploaded at a time.  The best way to upload large quantities of photos into PWA is to use the Picasa client.

I sort all of my photos by event, so I already have folders on my computer with names such as “Mom’s Birthday” or “Thanksgiving 2009”.  In Picasa, I create an album for each event and then add the appropriate photos to it.  Then I simply click “Sync to Web” and the album is uploaded to PWA.  Picasa has several options for what quality the photos are uploaded at.  I prefer to use full quality, and don’t mind the fact that it also takes up the most storage space since it’s rather cheap anyway.


To “restore” (download from PWA) you again have options.  From the PWA website you can download individual photos just as you would download any other file from the web.  If you have Picasa installed on your computer, you also have the option of downloading entire albums at a time through the web interface, or downloading multiple albums (or all your albums) through the Picasa client.  Simply go to File > “Import from Picasa Web Albums” and you can select which album(s) you want to download.  There are also third-party tools available for downloading albums without having Picasa installed.

What I Like

I’m a huge fan of Google’s amazing prices on storage space.  That being said, a low price is a horrible reason to choose a backup solution.  To me, PWA is primarily a great way to share photos with friends and family, and the fact that it can double as a cloud backup is an added bonus.

As I said before, I also really like the Picasa client as it has an amazing set of tools for organizing photos.  In addition to uploading images to PWA, you can do geotagging, use facial recognition to tag people, tag images with keywords, and search and sort by date and a variety of other fields.  It can take a little while to get things set up the way you like them, but once you do, maintaining your photo library is a snap! (pun intended)

What I Don’t Like

My biggest complaint about PWA is that albums can’t be nested, or created inside other albums.  My previous photo site (which I hosted out of my house) allowed this, and I had a wonderfully organized tree of all my photos.  Now all my albums are at the same level and simply sorted by date.  I know lots of people have requested the ability to nest albums, and hope the folks at Google get around to adding that feature soon.

Jul 22, 2010

This is the third post in my series on cloud backup solutions.  Previous posts were:

Today I’ll be talking about another cloud backup application called Jungle Disk.  I’ve been experimenting with it for a few months and am generally very happy with it.  Much like Mozy, Jungle Disk allows you to intelligently back up your files into the cloud.  Unlike Mozy, however, Jungle Disk only provides the client application for running backups; the actual storage of the backups is separate, which I’ll explain in more detail shortly.

Client Versions & Pricing

Jungle Disk has 4 different client versions depending on your needs.  Two of them are considered to be for personal use, the others are for businesses.  The personal editions are called “Simply Backup” and “Desktop Edition” and are respectively priced at $2 per month and $3 per month.  Both include 5GB of free backup space.  Simply Backup lives up to its name – it allows you to run either manual or scheduled backups of whatever files/folders you like on as many machines as you like.  Desktop Edition builds on that and allows you to access your Jungle Disk storage as a network drive.  It also has the ability to sync files between multiple machines.  Since I had no need for the features of Desktop Edition (and I’m also a tightwad) I have been using Simply Backup.

The business editions are “Workgroup Edition” and “Server Edition” and are priced at $4 and $5 per month respectively, both including 10GB of free backup space.  Workgroup Edition includes a multi-way sync feature so an entire group of people can keep files in sync between them.  Server Edition includes remote management features.


As I mentioned earlier, Jungle Disk only provides a client application for creating backups.  You have a choice as to where those backups are stored, as Jungle Disk supports both Amazon S3 and Rackspace Cloud Files.  Prices differ based on which service you choose.  (I’ve been using S3 for my backups.)

Since you’re paying the storage provider based on the amount of data stored, you have the option of how long to retain your backups before they’re deleted.  By default it’s set to 30 days, which seems way too short to me.  If you delete a file and don’t realize it until 31 days later, you would be out of luck because the last backup containing that file would have been deleted.  I currently have mine set to keep each backup for a year, but am considering disabling this option altogether and just keeping all backups forever.


Jungle Disk uses AES-256 encryption for your data, the key for which is based on a password you choose.  Don’t lose your password, otherwise you’ll be out of luck!


Backups are pretty simple – just select which folders and/or individual files you want to back up.  There’s a built-in scheduler as well as a bandwidth throttle.  Like most systems, the client will only attempt to back up files that have changed since the last backup.  It also features de-duplication technology, so only the parts of files that have changed are backed up.  This can be particularly helpful if you have a large file where only a small part has changed.


Restores are a snap.  You simply select the file(s)/folder(s) you want to restore, the backup date you wish to restore from, and the location you wish to restore them to.  I found restores to be quick and painless.  In Simply Backup, restores can only be done from the client.  In Desktop Edition and beyond, you have the option of restoring files from the web as well.

What I Like

Since your backup files are stored by third parties, the availability of your backups is subject to their guarantees.  Both Rackspace and Amazon S3 have pretty good SLAs.

A nifty feature Jungle Disk provides is backup reports via RSS.  You are provided with a link to a private RSS feed that is updated each time a backup runs.  It’s especially convenient for me as I have my backups set to run during the day while I’m at work.  I can see that my backup jobs have completed, how long they took and how many files were backed up all from the comfort of my RSS reader.

What I Don’t Like

I believe Jungle Disk could be doing a better job of selling itself, or at least letting prospective buyers figure out what they want.  They offer 4 different products at different prices, but there’s no easy way to compare them to each other.  Each product has its own page with a few paragraphs about some of the features it offers, but what they really need is a chart showing all the differences between the versions.

Another thing I don’t like is that while Jungle Disk does de-duplication, it doesn’t do de-duplication across computers.  If you have the same file on 2 computers, you’ll be backing up and storing 2 identical copies of that file.  Jungle Disk’s de-duplication takes place within what they call a “Backup Vault”, but only a single computer can store its backups within a given backup vault.  If you’re only backing up 1 computer this shouldn’t be an issue, however if you have multiple machines with identical files on them, you’ll be paying for more storage than you really need.

Next Cloud Backup Product Review: Picasa Web Albums