Archiving Data with Azure Blob Storage Archive Tier and PowerShell

As a former DBA, I am, unsurprisingly, a big fan of keeping data safe, and not just corporate data; this extends to my personal data as well. Early on I realized that the cloud was a great way to protect all my photos, videos, and other digital keepsakes that I absolutely could not stand to lose. As cloud offerings matured, products specific to archival storage were introduced that allowed for long-term storage at very low price points. The tradeoff for this lower storage cost is that, should you need to retrieve the data, it is not immediately available and you may have to wait a few hours. The ideal use case for these products, however, is that the data never needs to be retrieved at all; it is simply an additional copy being stored for safekeeping.

Two of the products I use extensively for this purpose are Amazon Glacier and, more recently, Microsoft Azure Blob Storage Archive Tier. As happy as I've been with Amazon Glacier since its introduction in 2012, I always hoped Microsoft would offer a similar service. My wish came true in Fall of 2017 when an archive tier of Azure Blob Storage was announced. Rather than branding this capability as a new product, Microsoft decided to present it as a new tier of Azure Blob Storage, alongside the existing hot and cool storage tiers.

A noticeable difference from the hot and cool storage tiers is that the archive tier is only available on a per-blob basis. While a storage account can be configured to place all blobs in either the hot or cool tier by default once they are uploaded, the archive tier is not an option. Once a blob is uploaded, it must be explicitly moved into the archive tier. Doing this through the Azure Portal takes several clicks per blob, and the free Azure Storage Explorer client is no better. While I found several third-party tools that can upload files to the archive tier, none were free. At that point, I decided to write my own method using PowerShell, which I am happy to share below.
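To make that per-blob tier change concrete, here's a minimal sketch of flipping a single already-uploaded blob to Archive from PowerShell. It assumes the AzureRM-era Azure.Storage module, and the account, container, and blob names are placeholders (the full upload script comes later):

# Placeholder account name and key - substitute your own
$context = New-AzureStorageContext -StorageAccountName "teststore700" `
    -StorageAccountKey "<your access key>"

# Fetch an already-uploaded blob and move it to the Archive tier
$blob = Get-AzureStorageBlob -Container "video" -Blob "HomeMovies/Movie1.mp4" -Context $context
$blob.ICloudBlob.SetStandardBlobTier("Archive")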

If you're already an Azure pro, feel free to skip ahead. But if you're new to Azure or don't already have a storage account, follow along with me to get one set up.

Creating a Storage Account

First, log in to the Azure Portal. If you don't already have an Azure account, you'll need to create one.

Once you're into Azure, you'll need a storage account. If you don't already have one, they're simple to create. In the Azure Portal, select "Storage accounts".

On the storage account screen, click "Add".

Next, enter the details for your storage account.

You'll need to give it a globally unique name, so it may take you a few tries. For all new accounts, Microsoft recommends the "Resource manager" deployment model. The account kind is "Blob storage". Choose whichever location you like, but realize that one closer to you will probably be faster. Since I'm doing archiving, I opt for standard performance and set the access tier to Cool by default. I like security, so I require secure transfers. Select your subscription or create one if you need to, and do the same for your resource group. Then click "Create".
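If you'd rather skip the clicking, the same account can be created from PowerShell. Here's a rough equivalent using the AzureRM module; the resource group, account name, and location are placeholders, and Standard_LRS is just one redundancy choice:

# Log in first if you haven't already
Login-AzureRmAccount

# Blob storage account: standard performance, Cool tier by default, HTTPS required
New-AzureRmStorageAccount -ResourceGroupName "archive-rg" `
    -Name "teststore700" `
    -Location "EastUS" `
    -SkuName Standard_LRS `
    -Kind BlobStorage `
    -AccessTier Cool `
    -EnableHttpsTrafficOnly $true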

Once created, your storage account will appear in the list. Now we need to create a container within that account. Click on the storage account, and then choose "Containers" from the menu on the left. Then click the "+" at the top to add a new container.


Give your container a name, and select an access level. Since I share my archives with no one, I make my containers Private. Click OK.

Your container will now appear in the list. Clicking on the container will show that it is empty, and that you can upload files to it right through the portal interface.
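Container creation can be scripted as well. A minimal sketch, assuming the Azure.Storage module and using the access key covered in the next step (account name and key are placeholders):

# Placeholder account name and key - substitute your own
$context = New-AzureStorageContext -StorageAccountName "teststore700" `
    -StorageAccountKey "<your access key>"

# -Permission Off makes the container private
New-AzureStorageContainer -Name "video" -Permission Off -Context $context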


Finally, for the PowerShell script to work, you will need a key to access the storage account. Go back to the storage account menu and select "Access keys". You will see two access keys provided for your account. Copy one of them for pasting into the script below.
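If you'd rather fetch the key programmatically than copy it from the portal, the AzureRM module can do that too (the resource group and account name are placeholders):

# List both keys for the account
Get-AzureRmStorageAccountKey -ResourceGroupName "archive-rg" -Name "teststore700"

# Or grab the first key's value directly for use in a script
$key = (Get-AzureRmStorageAccountKey -ResourceGroupName "archive-rg" -Name "teststore700")[0].Value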


The Script

UPDATE: The script previously located in this post has been updated in a later blog post.

Don't forget to copy in the name of your storage account (in the case of this demo I would paste "teststore700") and the access key!

You can also find both versions of this script on my GitHub.

This script has just a few parameters, which are as follows:

Parameter          Description
File               Name of the file or files to be uploaded. These can also be piped in.
ContainerName      Name of the container to upload to.
DestinationFolder  Name of the (virtual) destination folder inside the container. I like
                   everything to be in a folder, so I kept that in mind when scripting this out.
ConcurrentTasks    [optional] How many threads should upload the file. Defaults to 1, with a
                   maximum of 20. If you have lots of bandwidth, feel free to increase it.
BlobTier           [optional] Which tier of blob storage this file should be set to. Defaults
                   to "Cool". Possible values: "Hot", "Cool", "Archive".
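The script itself lives in the newer post linked above, but to show how the pieces fit together, here is a minimal sketch of a function with this parameter shape. It assumes the AzureRM-era Azure.Storage module, hard-codes placeholder account credentials, and is not the actual script:

function Upload-AzureBlob {
    [CmdletBinding()]
    param (
        # Accepts explicit paths or piped FileInfo objects (binds to FullName)
        [Parameter(Mandatory, ValueFromPipeline, ValueFromPipelineByPropertyName)]
        [Alias("FullName")]
        [string[]]$File,

        [Parameter(Mandatory)]
        [string]$ContainerName,

        [Parameter(Mandatory)]
        [string]$DestinationFolder,

        [ValidateRange(1, 20)]
        [int]$ConcurrentTasks = 1,

        [ValidateSet("Hot", "Cool", "Archive")]
        [string]$BlobTier = "Cool"
    )

    begin {
        # Placeholder account name and key - substitute your own
        $context = New-AzureStorageContext -StorageAccountName "teststore700" `
            -StorageAccountKey "<your access key>"
    }

    process {
        foreach ($f in $File) {
            # Place the blob inside the virtual destination folder
            $blobName = "$DestinationFolder/$(Split-Path -Path $f -Leaf)"

            $blob = Set-AzureStorageBlobContent -File $f -Container $ContainerName `
                -Blob $blobName -Context $context `
                -ConcurrentTaskCount $ConcurrentTasks -Force

            # The tier can only be set after the blob exists
            $blob.ICloudBlob.SetStandardBlobTier($BlobTier)
        }
    }
}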

And finally, some usage examples:

# single file
Upload-AzureBlob -ContainerName video -DestinationFolder HomeMovies `
    -BlobTier Archive -File "G:\Video\Movie1.mp4"

# multiple files
Upload-AzureBlob -ContainerName video -DestinationFolder HomeMovies `
    -BlobTier Archive -File "G:\Video\Movie1.mp4","G:\Video\Movie2.mp4"

# piped
Get-ChildItem *.mp4 | Upload-AzureBlob -ContainerName video `
    -DestinationFolder HomeMovies -BlobTier Cool

Happy archiving!