The commands in the script seem to be from the Azure ASM (Service Management) model. When I run the script, I get the error:

Get-AzureStorageAccount : No default subscription has been designated. Use Select-AzureSubscription -Default <subscriptionName> to set the default subscription.
At line:1 char:1
+ Get-AzureStorageAccount -StorageAccountName "manjusqlserver1"
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : CloseError: (:) [Get-AzureStorageAccount], ApplicationException
    + FullyQualifiedErrorId : Microsoft.WindowsAzure.Commands.ServiceManagement.StorageServices.GetAzureStorageAccountCommand
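The error message itself points at the fix. A minimal sketch using the ASM-era cmdlets (the exact parameter names can vary by module version, and the subscription name here is a placeholder):

Add-AzureAccount                                                       # sign in with the ASM (classic) cmdlets
Select-AzureSubscription -SubscriptionName "My Subscription" -Default  # designate a default subscription
Get-AzureStorageAccount -StorageAccountName "manjusqlserver1"          # should now succeed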
So even after changing the cast as others have mentioned, I am getting: "Get-BlobBytes : Unable to find type [Microsoft.WindowsAzure.Commands.Storage.Model.ResourceModel.AzureStorageBlob]. Make sure that the assembly that contains this type is loaded." If I make some modifications, I end up with: "Get-BlobBytes : ### Exception calling "GetPageRanges" with "2" argument(s): "Unable to read data from the transport connection: An existing connection was forcibly closed by the remote host."" Has anyone found a way to run this, or what am I missing?
The cast type actually depends on the version of the Azure PowerShell module you have installed. The script was working perfectly until I upgraded the Azure module from 0.9.x to 4.0, at which point the blob type cast changes to Microsoft.WindowsAzure.Commands.Common.Storage.ResourceModel.AzureStorageBase. But changing the cast to that may break other things. For some VHDs I can't measure the size of the VHD directly, so I have to create a snapshot and calculate the size of the snapshot. With 4.0, however, blobs and snapshots have different type casts, so you can't use the same function to calculate the size of both. I tried to work around it and decided it was easier to just run the script on a system that still had 0.9.x.
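If you're not sure which module version (and therefore which cast) you have, something like this should tell you (a sketch; the module may be named Azure, AzureRM, or Azure.Storage depending on how it was installed):

Get-Module -ListAvailable Azure, Azure.Storage | Select-Object Name, Version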
Doesn't the following code to get all the blobs in a container also include any snapshots? Won't that incorrectly inflate the calculated size, since snapshots by themselves don't really take up any additional storage?

$blobCount = 0
Get-AzureStorageBlob -Context $storageContext -Container $Container.Name |
    ForEach-Object {
        $containerSizeInBytes += Get-BlobBytes $_
        $blobCount++
    }

I think the fix would be to filter out anything with a SnapshotTime:

$blobCount = 0
Get-AzureStorageBlob -Context $storageContext -Container $Container.Name |
    Where-Object { $_.SnapshotTime -eq $null } |
    ForEach-Object {
        $containerSizeInBytes += Get-BlobBytes $_
        $blobCount++
    }

Also, I would often get this error with some premium storage blobs: "The remote server returned an error: (409) Conflict." Checking further, it looks like the actual error is "StatusMessage: This blob is being used by the system." The fix is to create a temporary snapshot and calculate the size of the snapshot instead, as in the sketch below.
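A minimal sketch of that snapshot workaround (assumes the blob is a page blob and that the unsegmented GetPageRanges call succeeds against the snapshot; CreateSnapshot and Delete are synchronous calls from the Microsoft.WindowsAzure.Storage client library):

$pageBlob = [Microsoft.WindowsAzure.Storage.Blob.CloudPageBlob]$Blob.ICloudBlob
$snapshot = $pageBlob.CreateSnapshot()           # point-in-time copy the system isn't using
try
{
    $sizeInBytes = 0
    $snapshot.GetPageRanges() |
        ForEach-Object { $sizeInBytes += $_.EndOffset - $_.StartOffset }
}
finally
{
    $snapshot.Delete()                           # remove the temporary snapshot
}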
For premium storage you are charged for the blob's allocated (provisioned) size, regardless of actual usage, but I think this script still calculates from actual usage.
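In other words (a sketch, not part of the original script): the billable figure for a premium page blob is simply its provisioned Length, while the script sums only the committed page ranges:

$provisionedBytes = $Blob.Length                 # what premium storage bills for
$usedBytes = 0
$Blob.ICloudBlob.GetPageRanges() |
    ForEach-Object { $usedBytes += $_.EndOffset - $_.StartOffset }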
I'm getting the following error for page blobs: Exception calling "GetPageRanges" with "0" argument(s): "Unable to read data from the transport connection: The connection was closed." In Fiddler I can see it sends the same request three times (probably retries), and every time it gets a chunked response without the terminating zero-length chunk. Sounds like a problem with the REST service. Any ideas?
VMs are stored as page blobs, whose pages must be enumerated to calculate the size. GetPageRanges can be passed offset and length parameters to return the pages piece by piece, which avoids the timeout on large blobs. Replace the Get-BlobBytes function with this:

function Get-BlobBytes
{
    param (
        [Parameter(Mandatory=$true)]
        [Microsoft.WindowsAzure.Commands.Storage.Model.ResourceModel.AzureStorageBlob]$Blob)

    # Base overhead + blob name (stored as Unicode, hence 2 bytes per character)
    $blobSizeInBytes = 124 + $Blob.Name.Length * 2

    # Add the size of the blob's metadata
    $metadataEnumerator = $Blob.ICloudBlob.Metadata.GetEnumerator()
    while ($metadataEnumerator.MoveNext())
    {
        $blobSizeInBytes += 3 + $metadataEnumerator.Current.Key.Length + $metadataEnumerator.Current.Value.Length
    }

    if ($Blob.BlobType -eq [Microsoft.WindowsAzure.Storage.Blob.BlobType]::BlockBlob)
    {
        $blobSizeInBytes += 8
        $Blob.ICloudBlob.DownloadBlockList() |
            ForEach-Object { $blobSizeInBytes += $_.Length + $_.Name.Length }
    }
    else
    {
        # Enumerate page ranges 1 GB at a time to avoid server-side timeouts
        [int64]$rangeSize = 1GB
        [int64]$start = 0
        $pages = "Start"
        while ($pages)
        {
            $pages = $Blob.ICloudBlob.GetPageRanges($start, $rangeSize)
            $pages | ForEach-Object { $blobSizeInBytes += 12 + $_.EndOffset - $_.StartOffset }
            $start += $rangeSize
        }
    }

    return $blobSizeInBytes
}
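For reference, a hypothetical invocation (the storage account, key, container, and blob names are placeholders):

$ctx  = New-AzureStorageContext -StorageAccountName "mystorageacct" -StorageAccountKey $key
$blob = Get-AzureStorageBlob -Context $ctx -Container "vhds" -Blob "myvm.vhd"
"{0:N0} bytes" -f (Get-BlobBytes $blob)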
Thanks Shane, your update avoids the timeout that was causing Ido's error. However, the update fails when the blob size is within one rangeSize of the maximum blob size, because the next iteration's offset falls out of range. Here is a fix:

[int64]$rangeSize = 1GB
[int64]$start = 0
$pages = "Start"
while ($pages)
{
    try
    {
        $pages = $Blob.ICloudBlob.GetPageRanges($start, $rangeSize)
    }
    catch
    {
        # A 416 here just means we've walked past the end of the blob
        if ($_ -like "*the range specified is invalid*")
        {
            $pages = $null
            break
        }
        else
        {
            Write-Error $_
        }
    }
    $pages | ForEach-Object { $blobSizeInBytes += 12 + $_.EndOffset - $_.StartOffset }
    $start += $rangeSize
}
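An alternative sketch (not from the original script) that avoids triggering the 416 at all, by bounding the loop with the blob's reported Length:

[int64]$rangeSize = 1GB
for ([int64]$start = 0; $start -lt $Blob.Length; $start += $rangeSize)
{
    # Never ask for a range that extends past the end of the blob
    $length = [math]::Min($rangeSize, $Blob.Length - $start)
    $Blob.ICloudBlob.GetPageRanges($start, $length) |
        ForEach-Object { $blobSizeInBytes += 12 + $_.EndOffset - $_.StartOffset }
}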
With the above changes, I now get this error often:

Get-BlobBytes : ### Exception calling "GetPageRanges" with "2" argument(s): "Unable to read data from the transport connection: An existing connection was forcibly closed by the remote host."
At line:156 char:38
+ $containerSizeInBytes += Get-BlobBytes $_
+                          ~~~~~~~~~~~~~~~~
    + CategoryInfo          : NotSpecified: (:) [Write-Error], WriteErrorException
    + FullyQualifiedErrorId : Microsoft.PowerShell.Commands.WriteErrorException,Get-BlobBytes

Is this one of those cases where Azure is overloaded and I need to have the command retry?
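If it is a transient failure, a simple retry wrapper around the call might help. A sketch (not part of the original script; $Blob, $start, and $rangeSize come from the surrounding function):

$maxRetries = 5
for ($attempt = 1; $attempt -le $maxRetries; $attempt++)
{
    try
    {
        $pages = $Blob.ICloudBlob.GetPageRanges($start, $rangeSize)
        break                                    # success, leave the retry loop
    }
    catch
    {
        if ($attempt -eq $maxRetries) { throw }  # give up after the last attempt
        Start-Sleep -Seconds (2 * $attempt)      # simple linear back-off
    }
}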
Exception calling "GetPageRanges" with "2" argument(s): "The remote server returned an error: (416) The range specified is invalid for the current size of the resource."
At line:30 char:13
+ $pages = $Blob.ICloudBlob.GetPageRanges($start, $rangeSiz ...

Does anyone have an idea about the 416 error? REF link: https://blogs.msdn.microsoft.com/windowsazurestorage/2012/03/26/getting-the-page-ranges-of-a-large-page-blob-in-segments/ I have tried reducing the range to 150 MB but no luck. Thanks, Eric
Hi, I have run this script against a number of different storage accounts, and every time the price of HDInsight is the same, at £436.33. This is odd because we don't even use HDInsight. Is anyone else having this issue? Thanks, Charlie
I always encountered the error "Get-BlobBytes : Unable to find type [Microsoft.WindowsAzure.Management.Storage.Model.ResourceModel.AzureStorageBlob]. Make sure that the assembly that contains this type is loaded. At C:\CalculateBlobCost2.ps1:132 char:38". Then I modified the type on line 68 to Microsoft.WindowsAzure.Commands.Storage.Model.ResourceModel.AzureStorageBlob and everything's OK. I hope this can help others. Very useful sample, thanks for posting.
I was able to run the script by removing the [Microsoft.WindowsAzure.Storage.Blob.CloudBlockBlob] cast in front of the $Blob variable in the Get-BlobBytes function.
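In other words (a sketch; the rest of the function stays unchanged), the parameter declaration becomes untyped, so it accepts whatever blob object the installed module version returns:

function Get-BlobBytes
{
    param (
        [Parameter(Mandatory=$true)]
        $Blob)   # no cast: works across module versions, and for snapshots too
    # ... body as before ...
}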
After investigation, I found the cast should be set to [Microsoft.WindowsAzure.Commands.Storage.Model.ResourceModel.AzureStorageBlob].
I just tested Nurun's solution and it works like a charm :) Thanks for sharing!
The script does indeed have a bad cast. Line 62 should read:

[Microsoft.WindowsAzure.Commands.Storage.Model.ResourceModel.AzureStorageBlob]$Blob)
Unfortunately I have no experience with PowerShell scripts at all, so I am unable to troubleshoot this: after running for some time, the script throws "Get-BlobBytes : Unable to find type" at line 124, char 38, apparently for every entry, and I end up cancelling it.