Security Monitoring and Detection Tips for your Storage Account – Part 2

The previous part of the series introduced you to the three different types of log that an Azure Storage account provides. Each can be used for a different purpose, but they can also be correlated for a single view. Understanding every piece of information they provide will strengthen your security monitoring capability.

In this article, let’s explore where the logs are stored, how to collect them, and an approach to building a single storage account log repository.

Operation Log

As explained in the previous article, the Storage operation log is stored in the Azure Activity log and is retrievable directly from the Activity log blade of each resource.

It can also be viewed from the Azure Monitor Activity log page. However, that view gives you all logged activities in every subscription you have access to. For storage accounts specifically, you need a filter.

You can use PowerShell, Azure CLI, Python or other languages to retrieve the activity. Below is a sample PowerShell script to help:

$resourceProvider = "Microsoft.Storage"
Get-AzureRmLog -StartTime (Get-Date).AddDays(-1) `
               -ResourceProvider $resourceProvider `
               | ForEach-Object {
                   Write-Host -ForegroundColor White "--------------"
                   Write-Host -ForegroundColor Green "ResourceId:" $_.ResourceId.Split('/')[8]
                   Write-Host -ForegroundColor Blue "ResourceGroupName:" $_.ResourceGroupName
                   Write-Host -ForegroundColor Yellow "Action:" $_.Authorization.Action
                   Write-Host -ForegroundColor White "--------------"
               }

Bear in mind that Activity log events are retained for 90 days only, so any event created before that window cannot be retrieved. For longer retention, you need to export them to a storage account, an Event Hub, or a Log Analytics workspace.

Storage Metric

Storage metric events are not stored in the Azure Activity log. By default, metrics are stored in the same storage account under the Table service. You can use Azure Storage Explorer to download and view the content of each table.
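The classic metrics tables follow the documented naming convention $Metrics{Hour|Minute}{Primary|Secondary}Transactions{Blob|Table|Queue|File} (plus $MetricsCapacityBlob for blob capacity). As a quick illustration of that scheme, here is a small Python sketch; the helper function itself is hypothetical, only the naming convention comes from the documentation:

```python
def metrics_table_name(service: str, period: str = "Hour",
                       location: str = "Primary") -> str:
    """Build the classic Storage Analytics metrics table name for a service.

    Follows the documented convention:
    $Metrics{Hour|Minute}{Primary|Secondary}Transactions{Blob|Table|Queue|File}
    """
    if service not in {"Blob", "Table", "Queue", "File"}:
        raise ValueError(f"unknown service: {service}")
    if period not in {"Hour", "Minute"}:
        raise ValueError(f"unknown period: {period}")
    if location not in {"Primary", "Secondary"}:
        raise ValueError(f"unknown location: {location}")
    return f"$Metrics{period}{location}Transactions{service}"

print(metrics_table_name("Blob"))            # $MetricsHourPrimaryTransactionsBlob
print(metrics_table_name("Queue", "Minute")) # $MetricsMinutePrimaryTransactionsQueue
```

Knowing the table name lets you point Azure Storage Explorer (or any Table service client) directly at the granularity and endpoint you care about.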

You can also configure metrics to be sent to a Log Analytics workspace by deploying a diagnostic settings template. Once collected, the metrics land in the AzureMetrics table in the workspace and can be queried with KQL:

AzureMetrics
| where ResourceProvider == "MICROSOFT.STORAGE"
| where MetricName == "Transactions"
| summarize count() by bin(TimeGenerated, 1h)
| render timechart

Rendered timechart of storage account transactions with 1 hour interval

Storage Analytics

Unlike the other two event types, Storage Analytics has no built-in option to write its log to a Log Analytics workspace or an Event Hub (for SIEM integration). Moreover, it cannot be enabled via an ARM template as of this writing. The reason is that this setting is controlled in the data plane through the storage SDK, so it is not surfaced in ARM operations.

You must enable it via the Azure Portal, the REST API, or PowerShell.

Storage Analytics can be enabled from Diagnostic Setting (classic)

The following PowerShell enables Storage Analytics logging for the Blob service. You can do the same for the Table and Queue services; the File service is not supported.

$storageAccountName = "az66bex6otpb4ttk"
$storageAccountKey = "XXXXXXXX"
$retention = 15
$ctx = New-AzureStorageContext -StorageAccountName $storageAccountName `
                               -StorageAccountKey $storageAccountKey
$ctx | Get-AzureStorageServiceLoggingProperty -ServiceType Blob
$ctx | Set-AzureStorageServiceLoggingProperty -ServiceType Blob `
                                              -LoggingOperations read,write,delete `
                                              -RetentionDays $retention

The Storage Analytics log is stored in a container named $logs, which can be accessed via Azure Storage Explorer (a client application; the container is not shown in the Azure Portal).

Storage Analytics log seen from Azure Storage Explorer
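Each entry in the $logs container is a single semicolon-delimited line in the Storage Analytics log format (version 1.0), and quoted fields such as the request URL may themselves contain semicolons. A minimal Python sketch of pulling out the fields most relevant for detection — operation type, request status, and authentication type — could look like the following; the field positions follow the documented 1.0 layout, and the sample line is fabricated for illustration:

```python
import csv
import io

# Field positions (0-based) in Storage Analytics log format version 1.0.
OPERATION_TYPE = 2
REQUEST_STATUS = 3
AUTH_TYPE = 7          # "authenticated", "anonymous", or "sas"
REQUESTED_OBJECT = 12  # e.g. "/account/container/blob.pdf"

def anonymous_successes(log_text: str) -> list:
    """Return (operation, object-key) pairs for successful anonymous requests."""
    hits = []
    # Quoted fields (e.g. the request URL) may contain semicolons,
    # so parse with the csv module rather than a plain split.
    for row in csv.reader(io.StringIO(log_text), delimiter=";", quotechar='"'):
        if not row:
            continue
        if row[AUTH_TYPE] == "anonymous" and row[REQUEST_STATUS] == "Success":
            hits.append((row[OPERATION_TYPE], row[REQUESTED_OBJECT]))
    return hits

# A fabricated sample line in the 1.0 layout (only the fields used above matter).
sample = ('1.0;2020-05-01T10:30:00.0000000Z;GetBlob;Success;200;12;10;anonymous;;'
          'az66bex6otpb4ttk;blob;'
          '"https://az66bex6otpb4ttk.blob.core.windows.net/data/report.pdf";'
          '/az66bex6otpb4ttk/data/report.pdf;aaaa-bbbb;1;203.0.113.7:443\n')
print(anonymous_successes(sample))
# [('GetBlob', '/az66bex6otpb4ttk/data/report.pdf')]
```

A filter like this — anonymous plus Success — is exactly the kind of question you will ask of this log again and again, which is part of the argument for getting it into a queryable repository rather than grepping raw files.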

Storage Account Log repository

There are pros and cons when choosing the destination to which storage account logs are streamed. Storing logs in another storage account is the cheapest option; however, processing the data (e.g. querying for a pattern, filtering only anonymous requests) can be time consuming. Sending your storage account logs to an Event Hub is a good idea if your requirement is near real-time monitoring and detection. Another aspect of destination selection is SIEM integration: most enterprise SIEM products today support Azure Event Hub integration, so you wouldn’t have to worry about that.

What about Log Analytics? Personally, I’d highly recommend building a single log repository for your storage accounts there. The huge advantage is the query capability powered by the Kusto Query Language (KQL). Microsoft has invested heavily in KQL for data query and analytics, and it offers very helpful functions for building historical baselines to detect anomalous, high-entropy requests. Moreover, Log Analytics is the backbone data store of Azure Sentinel, and you are going to need that data available for hunting and exploration.

Simple Kusto query to list all blobs that were successfully accessed anonymously

The illustration below gives you an idea of centralizing storage account logs into a single Log Analytics workspace.

While Azure Activity logs and metrics can be configured via ARM template or the Azure Portal to send events to a Log Analytics workspace, Storage Analytics cannot. You would need to read the $logs container periodically and feed the log files into a Log Analytics workspace, for example via the HTTP Data Collector API.
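The HTTP Data Collector API authorizes each POST with an HMAC-SHA256 signature computed over the request headers. Below is a minimal Python sketch of building that Authorization header; the string-to-sign layout follows the documented API, while the workspace ID, key, and body are placeholders:

```python
import base64
import hashlib
import hmac
from datetime import datetime, timezone

def build_signature(workspace_id: str, shared_key: str, body: bytes,
                    date_rfc1123: str,
                    content_type: str = "application/json") -> str:
    """Build the SharedKey Authorization header for the HTTP Data Collector API."""
    # String to sign: VERB, content length, content type, x-ms-date header, resource.
    string_to_sign = "\n".join([
        "POST",
        str(len(body)),
        content_type,
        f"x-ms-date:{date_rfc1123}",
        "/api/logs",
    ])
    key = base64.b64decode(shared_key)  # the workspace key is base64-encoded
    digest = hmac.new(key, string_to_sign.encode("utf-8"), hashlib.sha256).digest()
    return f"SharedKey {workspace_id}:{base64.b64encode(digest).decode()}"

# Placeholder workspace ID and key for illustration only.
workspace_id = "00000000-0000-0000-0000-000000000000"
shared_key = base64.b64encode(b"not-a-real-key").decode()
body = b'[{"Operation": "GetBlob", "AuthType": "anonymous"}]'
rfc1123 = datetime.now(timezone.utc).strftime("%a, %d %b %Y %H:%M:%S GMT")
auth = build_signature(workspace_id, shared_key, body, rfc1123)

# The signed POST then goes to:
#   https://{workspace_id}.ods.opinsights.azure.com/api/logs?api-version=2016-04-01
# with headers: Authorization, Log-Type (the custom table name), x-ms-date, Content-Type.
print(auth.startswith("SharedKey"))
```

The records posted this way appear in the workspace as a custom table named after the Log-Type header (with a _CL suffix), ready for the same KQL queries as the rest of your storage logs.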

AzSec is in the process of testing a script that can be used with an Azure Automation account or an Azure Function. The script will be shared publicly in this repository. In the meantime, you can refer to this useful script for your evaluation, though it lacks logic to avoid ingesting duplicate data.


In this part, you have learned about collecting logs and the destinations they should be sent to. In the next article, let’s explore some storage account attack vectors and how the recorded events look in your log repository, along with some exercises to raise storage-account-related alerts that are caught by Azure Security Center’s Advanced Threat Protection for Storage.
