I created this script for a client who wanted to move their off-site backups from tape to Azure Blob storage. The total amount of data to be migrated to Azure was approximately 40 TB, and to align with their on-premises storage structure we spread it across six blob containers. The off-site data only needs to be accessed in a DR scenario, and there are no incremental file changes, just nightly uploads of new files, so to get the best value we decided that all uploaded data should be set to the 'Archive' blob access tier.

Currently (March 2018) there is no option within the Storage service in Azure to create an 'Archive' access tier storage account. My only option was to create a 'Hot' or 'Cool' tier storage account (and its containers) and then individually set each blob (file) within the containers to the 'Archive' tier. From some online recommendations, I found that the best option was to migrate all data into a 'Hot' storage account/container and then change each file to the 'Archive' access tier, because you are not charged for moving blobs out of the 'Hot' tier, but you are charged for moving blobs out of the 'Cool' tier.
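
For a single blob, the tier change itself is a one-liner on the blob's ICloudBlob object. Here is a minimal sketch of that, using the same Azure PowerShell module as the full script below; the account name, key, container and blob name are placeholders:

Import-Module Azure

#Placeholder storage account details - replace with your own
$context = New-AzureStorageContext -StorageAccountName "storageaccountname" -StorageAccountKey "############"

#Fetch a single blob and move it to the 'Archive' tier
$blob = Get-AzureStorageBlob -Container "container1" -Blob "examplefile.bak" -Context $context
$blob.ICloudBlob.SetStandardBlobTier("Archive")

#Confirm the new tier
$blob.ICloudBlob.Properties.StandardBlobTier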

I started uploading the on-premises files to the 'Hot' storage account using the AzCopy utility, which worked as expected.
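
For reference, the upload command looked roughly like this (AzCopy for Windows syntax; the source folder, account name and key are placeholders, and /S uploads the folder recursively):

AzCopy /Source:D:\Backups /Dest:https://storageaccountname.blob.core.windows.net/container1 /DestKey:############ /S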

Below were my requirements:

So I started to develop my script, which pretty much revolved around the Get-AzureStorageBlob cmdlet! As the size of the containers grew, I soon realised that the shell and the cmdlet were unable to process such a large dataset in one go, so I came across an article describing the use of the continuation token with the Get-AzureStorageBlob cmdlet. This allows blobs to be pulled down in smaller batches or sub-sets and processed incrementally.
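
Stripped of the filtering and reporting, the paging pattern at the heart of the full script below looks like this: each call returns at most -MaxCount blobs, and the token taken from the last blob in the batch is passed back in until there is nothing left to fetch (this sketch assumes $context and $containername are already defined):

$Token = $Null
do {
    $blobs = Get-AzureStorageBlob -Container $containername -Context $context -MaxCount 50000 -ContinuationToken $Token
    #...process this batch of blobs...
    if ($blobs.Count -le 0) { break }
    $Token = $blobs[$blobs.Count - 1].ContinuationToken
} while ($Token -ne $Null)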

I ran this script every hour during the initial migration phase, and now run it every night against each blob container.
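
If you want to run it on a schedule in the same way, a Windows scheduled task along these lines will do the job (the script path, task name and time below are just examples):

$action = New-ScheduledTaskAction -Execute "powershell.exe" -Argument "-ExecutionPolicy Bypass -File C:\Scripts\Set-BlobArchiveTier.ps1"
$trigger = New-ScheduledTaskTrigger -Daily -At 2am
Register-ScheduledTask -TaskName "Blob Tier Updates" -Action $action -Trigger $trigger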

I'm not the most savvy PowerShell scripter, so you'll probably come across some unnecessary bulk in my script! But I hope parts of it can be useful to others in the community, and any feedback, positive or negative, would be greatly appreciated.

Thanks

Brendan

 

PowerShell
Import-Module Azure 
 
#Define storage account information 
$StorageAccount = "storageaccountname" 
$StorageAccountKey = "############################################" 
$containername = "container1" 
  
#Create a storage context 
$context = New-AzureStorageContext -StorageAccountName $StorageAccount -StorageAccountKey $StorageAccountKey 
 
#Set variables for processing batches & continuation Token 
 
$MaxReturn = 50000 
$Token = $Null 
 
#Define a blob array for reporting 
$blobarray = @() 
 
#Create a loop to process the whole container in blob batches of 50,000 
 
do 
 { 
      
     #Process a total of 50,000 Blobs at a time. This is extremely useful for large containers 
     $blobs = Get-AzureStorageBlob -Container $containername -Context $context -MaxCount $MaxReturn -ContinuationToken $Token 
 
     #I schedule this script to run every hour, so the filter below only processes blobs modified in the last 90 minutes that are still in the 'Hot' tier. NBLOBS is short for New Blobs! 
      
     $nblobs = $blobs | where {$_.LastModified -gt (Get-Date).AddMinutes(-90)} | Where-Object {$_.ICloudBlob.Properties.StandardBlobTier -eq 'Hot'} 
 
     # A 'For' loop to process the filtered out blobs 
 
     foreach($nblob in $nblobs) { 
 
        #Change the access tier of the newly uploaded blobs 
 
        $nblob.ICloudBlob.SetStandardBlobTier("Archive") 
 
        #Add these blobs to our array 
 
        $blobarray += $nblob 
 
     } 
      
     if ($blobs.Count -le 0) { break } 
 
     $Token = $blobs[$blobs.Count -1].ContinuationToken; 
 } 
 While ($Token -ne $Null) 
 
#Export results of changed blobs to CSV file 
 
$timestamp = Get-date -UFormat %d%m%y%H%M 
 
$fulldate = Get-Date -Format g  
 
$export = "C:\temp\Blob Tier Updates - $containername $timestamp.csv" 
 
$blobarray | Select-Object -Property Name, BlobType, LastModified, Length, ContentType, @{n='AccessTier';e={$_.ICloudBlob.Properties.StandardBlobTier}} | Export-Csv $export -NoTypeInformation 
 
#Email CSV file to pre-determined recipients 
 
Start-Sleep -s 5 
 
$smtpServer ="8.8.8.8" 
$file = $export 
$att = New-Object Net.Mail.Attachment($file) 
$msg = New-Object Net.Mail.MailMessage 
$smtp = New-Object Net.Mail.SmtpClient($smtpServer) 
$msg.From = "name@test.com" 
$msg.To.Add("name1@test.com") 
$msg.Subject = "$timestamp : Azure Blob Storage Updates" 
$msg.Body = "Report attached for Blob Tier Updates on $containername Storage container on $fulldate" 
$msg.Attachments.Add($att) 
$smtp.Send($msg) 
$att.Dispose()