Get Billable Size of Windows Azure Blobs (w/Snapshots) in a Container or Account

Enumerates all blobs in either one container or one storage account and sums the billable bytes. This includes all block and page blobs, plus metadata on both blobs and containers.
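A typical invocation might look like this (a sketch; the script file name and parameter names here are assumptions based on the discussion threads below):

```powershell
# Sum billable bytes for every container in a storage account
.\CalculateBlobCost.ps1 -StorageAccountName "mystorageaccount"

# Or restrict the calculation to a single container
.\CalculateBlobCost.ps1 -StorageAccountName "mystorageaccount" -ContainerName "vhds"
```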

4 stars (4 ratings) · Downloaded 4,990 times · Category: Windows Azure · Updated 10/16/2014


  • Get-BlobBytes error
    2 Posts | Last post May 26, 2017
    • So even after changing the cast as others have mentioned, I am getting "Get-BlobBytes : Unable to find type [Microsoft.WindowsAzure.Commands.Storage.Model.ResourceModel.AzureStorageBlob]. Make sure that the assembly that contains this type is loaded."
      
      if I make some modifications I end up with:
      
      "Get-BlobBytes : ### Exception calling "GetPageRanges" with "2" argument(s): "Unable to read data from the transport connection: An existing connection was forcibly closed by the remote host."" 
      
      
      Has anyone by chance found a way to run this, or what am I missing?
      
    • The cast type actually depends on the version of the Azure PowerShell module you have installed. The script was working perfectly until I upgraded the module from 0.9x to 4.0, and now the blob type cast changes to Microsoft.WindowsAzure.Commands.Common.Storage.ResourceModel.AzureStorageBase. But changing the cast to that may break other things. For some VHDs, I can't measure the size of the VHD directly, so I have to create a snapshot and calculate the size of the snapshot. With 4.0, blobs and snapshots have different type casts, so you can't use the same function to calculate the size of both. I tried to work around it and decided it was easier to just run the script on a system that still had 0.9x.
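      One way to sidestep the version-dependent cast (a sketch, not verified against every module version) is to drop the strong type from the parameter and check at runtime for the member the function actually uses:

      ```powershell
      function Get-BlobBytes
      {
          # No strong cast: the AzureStorageBlob type moved namespaces between
          # Azure PowerShell 0.9.x and later releases, so accept any object
          # that exposes the ICloudBlob wrapper the function relies on.
          param (
              [Parameter(Mandatory=$true)]
              $Blob)

          if (-not $Blob.ICloudBlob)
          {
              throw "Get-BlobBytes expects an object with an ICloudBlob property."
          }

          # ... rest of the function unchanged ...
      }
      ```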
  • Bugs?
    1 Posts | Last post September 13, 2016
    • Doesn't the following code to get all the blobs in a container also include any snapshots? Won't that incorrectly inflate the calculated size, since snapshots by themselves don't really take up any additional storage?
      
          $blobCount = 0 
          Get-AzureStorageBlob -Context $storageContext -Container $Container.Name |  
              ForEach-Object {  
                  $containerSizeInBytes += Get-BlobBytes $_  
                  $blobCount++
                  } 
      
      I think the fix would be:
      
          $blobCount = 0
          Get-AzureStorageBlob -Context $storageContext -Container $Container.Name |
              Where-Object { $_.SnapshotTime -eq $null } |
              ForEach-Object {
                  $containerSizeInBytes += Get-BlobBytes $_
                  $blobCount++
              }
      
      
      Also, I would often get this error with some premium storage blobs:
      "The remote server returned an error: (409) Conflict." Checking further, it looks like the actual error is "StatusMessage: This blob is being used by the system.". The fix is to create a temporary snapshot and calculate the size of the snapshot instead.
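      The snapshot workaround described above might be sketched like this (untested; it assumes Get-BlobBytes has been adjusted to accept the SDK blob object that CreateSnapshot returns):

      ```powershell
      # Premium blobs in use by the system can return 409 Conflict, so
      # measure a temporary snapshot instead and remove it afterwards.
      $snapshot = $Blob.ICloudBlob.CreateSnapshot()
      try
      {
          $sizeInBytes = Get-BlobBytes $snapshot
      }
      finally
      {
          $snapshot.Delete()
      }
      ```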
      
      
      
  • Does this work for Premium Storage?
    1 Posts | Last post August 25, 2016
    • For premium storage, you should be charged for the provisioned blob size, regardless of actual usage. But I think this script still calculates from actual usage.
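      If premium billing does follow the provisioned size, the page-range walk could be skipped for those blobs (a sketch; not confirmed by the script author):

      ```powershell
      # For premium storage the charge is based on the allocated size,
      # which is available directly from the blob properties.
      $provisionedBytes = $Blob.ICloudBlob.Properties.Length
      ```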
  • Getting an error when calling GetPageRanges
    4 Posts | Last post May 31, 2016
    • I'm getting the following error for page blobs:
      Exception calling "GetPageRanges" with "0" argument(s): "Unable to read data from the transport 
      connection: The connection was closed."
      
      In Fiddler, I see it sends the same request three times (probably retry), and every time it gets a chunked response, without the zero-chunk at the end.
      
      Sounds like a problem with the REST service.
      
      Any ideas?
    • VMs are stored as page blobs, which must have their pages enumerated to calculate the size. GetPageRanges can be passed offset and length parameters to return the pages piece by piece.
      Replace the Get-BlobBytes function with this:
      
      function Get-BlobBytes
      {
          param (
              [Parameter(Mandatory=$true)]
              [Microsoft.WindowsAzure.Commands.Storage.Model.ResourceModel.AzureStorageBlob]$Blob)
       
          # Base + blob name
          $blobSizeInBytes = 124 + $Blob.Name.Length * 2
       
          # Get size of metadata
          $metadataEnumerator = $Blob.ICloudBlob.Metadata.GetEnumerator()
          while ($metadataEnumerator.MoveNext())
          {
              $blobSizeInBytes += 3 + $metadataEnumerator.Current.Key.Length + $metadataEnumerator.Current.Value.Length
          }
       
          if ($Blob.BlobType -eq [Microsoft.WindowsAzure.Storage.Blob.BlobType]::BlockBlob)
          {
              $blobSizeInBytes += 8
              $Blob.ICloudBlob.DownloadBlockList() | 
                  ForEach-Object { $blobSizeInBytes += $_.Length + $_.Name.Length }
          }
          else
          {
              [int64]$rangeSize = 1GB
              [int64]$start = 0; $pages = "Start";
              
              While ($pages){
                  $pages = $Blob.ICloudBlob.GetPageRanges($start, $rangeSize)             
                  $pages | ForEach-Object { $blobSizeInBytes += 12 + $_.EndOffset - $_.StartOffset }        
                  $start += $rangeSize           
              }
          }
      
          return $blobSizeInBytes
      }
    • Thanks Shane, your update avoids the timeout issue causing Ido's error.
      The update does fail when the blob size is within one rangeSize of the maximum blob size, because the next iteration requests a range that is out of bounds. Here is a fix:
      
          {
              [int64]$rangeSize = 1GB
              [int64]$start = 0; $pages = "Start";

              While ($pages)
              {
                  try
                  {
                      $pages = $Blob.ICloudBlob.GetPageRanges($start, $rangeSize)
                  }
                  catch
                  {
                      if ($_ -like "*the range specified is invalid*")
                      {
                          $pages = $null
                          break
                      }
                      else
                      {
                          Write-Error $_
                      }
                  }
                  $pages | ForEach-Object { $blobSizeInBytes += 12 + $_.EndOffset - $_.StartOffset }
                  $start += $rangeSize
              }
          }
      
    • With the above changes, now I get this error often:
      
      Get-BlobBytes : ### Exception calling "GetPageRanges" with "2" argument(s): "Unable to read data from the transport connection: An existing connection was forcibly closed by the remote host."
      At line:156 char:38
      +             $containerSizeInBytes += Get-BlobBytes $_
      +                                      ~~~~~~~~~~~~~~~~
          + CategoryInfo          : NotSpecified: (:) [Write-Error], WriteErrorException
          + FullyQualifiedErrorId : Microsoft.PowerShell.Commands.WriteErrorException,Get-BlobBytes
      
      Is this one of those cases where Azure is overloaded and I need to have the command retry?
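      A simple retry with backoff around the call might help with transient transport errors (a sketch; the attempt count and delays are arbitrary):

      ```powershell
      $attempt = 0
      do
      {
          try
          {
              # Retry the page-range request a few times before giving up
              $pages = $Blob.ICloudBlob.GetPageRanges($start, $rangeSize)
              break
          }
          catch
          {
              $attempt++
              if ($attempt -ge 5) { throw }
              Start-Sleep -Seconds (5 * $attempt)
          }
      } while ($true)
      ```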
  • About the GetPageRanges error 416
    1 Posts | Last post May 10, 2016
    • Exception calling "GetPageRanges" with "2" argument(s): "The remote server returned an error: (416) The range specified is invalid for the current size of the resource.."
      At line:30 char:13
      +             $pages = $Blob.ICloudBlob.GetPageRanges($start, $rangeSiz ...
      
      
      Does anyone have an idea about error 416?
      Reference link:
      https://blogs.msdn.microsoft.com/windowsazurestorage/2012/03/26/getting-the-page-ranges-of-a-large-page-blob-in-segments/
      I have tried reducing the range to 150 MB, but no luck.

      Thanks
      Eric
  • HDInsight Billing
    1 Posts | Last post January 06, 2016
    • Hi, I have run this script against a number of different storage accounts, and every time the price shown for HDInsight is the same: £436.33. This is odd because we don't even use HDInsight. Is anyone else having this issue?
      
      Thanks
      Charlie
  • fix a bug in code
    5 Posts | Last post August 31, 2015
    • I always encounter the error "Get-BlobBytes : Unable to find type [Microsoft.WindowsAzure.Management.Storage.Model.ResourceModel.AzureStorageBlob]. Make sure that the assembly that contains this type is loaded.
      At C:\CalculateBlobCost2.ps1:132 char:38". I then modified the type on line 68 to Microsoft.WindowsAzure.Commands.Storage.Model.ResourceModel.AzureStorageBlob and everything is OK. I hope this can help others.
      
      Very useful sample. Thanks for posting. 
    • I was able to run the script by removing the [Microsoft.WindowsAzure.Storage.Blob.CloudBlockBlob] cast in front of the $blob variable in the Get-BlobBytes function.
    • After investigation I found the cast should be set to:
      [Microsoft.WindowsAzure.Commands.Storage.Model.ResourceModel.AzureStorageBlob]
    • I just tested Nurun's solution and it works like a charm :) Thanks for sharing!
    • The script does indeed have a bad cast. 
      
      Line 62 should read:
              [Microsoft.WindowsAzure.Commands.Storage.Model.ResourceModel.AzureStorageBlob]$Blob)
      
  • "Unable to find type" error
    1 Posts | Last post July 17, 2014
    • Unfortunately I have no experience of PowerShell scripts at all, and so am unable to troubleshoot this:
      
      After running for some time, the script throws "Get-BlobBytes : Unable to find type" at line 124, char 38, apparently for every entry; I end up cancelling it.