Let's imagine you manage your company's Microsoft Azure subscription. You receive a request from an important business partner who needs to see a specific sales data file named jan2017.csv stored in the binary large object (blob) service of a storage account named sales4sysopsdata.
Corporate security policy prevents you from sending a copy of the requested file as an email attachment or over FTPS. How can you solve the problem?
Today I will teach you how to use shared access signature (SAS) tokens to provide time-restricted access to blob resources in Azure storage accounts.
Setting the stage ^
In today's exercise, we will use Microsoft's free Azure Storage Explorer desktop application to grant our business partner her desired level of access to that sales file. Go ahead and install Storage Explorer, start the application, and authenticate to your subscription.
As you can see in the following screenshot, the jan2017.csv file is in a container named reports in the sales4sysopsdata storage account.
Creating your first SAS URL ^
In Storage Explorer, right-click jan2017.csv and select Get Shared Access Signature… from the context menu. The Shared Access Signature form includes the following fields:
- Access policy: A stored access policy is a way to manage multiple SAS tokens in the same container. We'll deal with this option later in today's tutorial.
- Start time: SAS URLs are time-limited; the best practice is to set the start time a few minutes before the current time to work around latency issues.
- Expiry time: Because a SAS URL is usable by anybody on Earth with that address, make the expiration time as short as is feasible for your use case.
- Time zone: All SAS timestamps need to be in Coordinated Universal Time (UTC). However, Storage Explorer helpfully converts the time if you elect to use your local time zone.
- Container-level URI: Use this to open up an entire container for access instead of a single resource.
Note: I tend to use Uniform Resource Identifier (URI) and Uniform Resource Locator (URL) interchangeably. But technically a SAS address is a URI unless we're speaking of plugging the address into a client application and actually using the URI. The access method (such as HTTPS) makes a URI a URL.
The second screen in the Shared Access Signature dialog box is important because it gives you the actual URL. Copy this long address into the Windows Clipboard. Let me break down each section of a SAS URL (check the documentation as well for additional details):
- https://sales4sysopsdata.blob.core.windows.net/reports/jan2017.csv: This is the URI to the blob in question
- ?st=2018-09-25T12%3A48%3A49Z: Start time that the SAS token becomes valid
- &se=2018-09-25T13%3A48%3A00Z: Expiration time
- &sp=rl: Permission—in this example, we allow List and Read permissions to the blob (this enables our business partner to download the file directly by using her web browser)
- &sv=2018-03-28: Application programming interface (API) version
- &sr=b: Signed resource—here, b designates a single blob. In this article, we deal with service SAS URLs, which delegate access to a single Azure storage account service: blob, queue, table, or file
- &sig=xiWghgWBme7et2E%2FANxVhkGmYTVKkg9wVYUw3wi5gf4%3D: Digital signature—Azure computes this signature with one of your two storage account secret keys, so changing a single value in the SAS URI invalidates the entire token
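If you'd like to inspect these components programmatically, the standard library of most languages can do it. Here is a short Python sketch (the URL is the example from this article, with its illustrative signature) that splits a SAS URL into its query parameters:

```python
from urllib.parse import urlsplit, parse_qs

# Example SAS URL from this article (the signature is illustrative)
sas_url = ("https://sales4sysopsdata.blob.core.windows.net/reports/jan2017.csv"
           "?st=2018-09-25T12%3A48%3A49Z&se=2018-09-25T13%3A48%3A00Z"
           "&sp=rl&sv=2018-03-28&sr=b"
           "&sig=xiWghgWBme7et2E%2FANxVhkGmYTVKkg9wVYUw3wi5gf4%3D")

parts = urlsplit(sas_url)
# parse_qs also percent-decodes values (%3A becomes ':')
params = {k: v[0] for k, v in parse_qs(parts.query).items()}

print(parts.scheme + "://" + parts.netloc + parts.path)  # blob URI
for key in ("st", "se", "sp", "sv", "sr"):
    print(f"{key} = {params[key]}")
```

This makes it easy to verify, for example, that a token you are about to hand out really carries only the rl permissions and the expiry time you intended.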
Sure enough, as long as our business partner accesses the SAS URL within the token's lifetime, she can download the CSV file as shown in the following screenshot.
Additional detail ^
Traditionally, two secret access keys protect Azure storage accounts. It's good security hygiene in Azure to rotate these values periodically. The benefit SAS tokens bring to the table is that you do not have to share either of these highly sensitive keys. What's both convenient and cool is you can use Azure Key Vault to store and access these account keys programmatically.
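This is also why sharing a SAS token is safer than sharing a key: the sig value is an HMAC-SHA256 hash of the token's own fields, keyed with the base64-decoded account key, so the recipient can use the token without ever seeing the key. The following Python sketch is a simplified illustration of that scheme only — the real string-to-sign Azure uses has a precise, versioned field order documented by Microsoft, and the key below is a made-up example, not a real secret:

```python
import base64
import hashlib
import hmac

# Hypothetical storage account key (base64-encoded), NOT a real secret
account_key_b64 = base64.b64encode(b"example-secret-key-material").decode()

# Simplified string-to-sign: in reality, Azure joins a documented,
# versioned list of fields (permissions, start, expiry, resource, ...)
string_to_sign = "\n".join([
    "rl",                    # permissions (sp)
    "2018-09-25T12:48:49Z",  # start time (st)
    "2018-09-25T13:48:00Z",  # expiry time (se)
    "/blob/sales4sysopsdata/reports/jan2017.csv",  # canonicalized resource
])

# HMAC-SHA256 over the fields, keyed with the decoded account key
key_bytes = base64.b64decode(account_key_b64)
digest = hmac.new(key_bytes, string_to_sign.encode("utf-8"),
                  hashlib.sha256).digest()
signature = base64.b64encode(digest).decode()
print(signature)  # any change to the fields above yields a different value
```

Because the signature covers every field, tampering with the permissions or expiry time in a captured SAS URL produces a token the service will reject.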
More recently, Microsoft has introduced Azure Active Directory (Azure AD)-based authentication to Azure storage account resources. However, this latter feature is in preview state, which means it has no service-level agreement (SLA). Thus, you should not implement it for production workloads.
Besides the service SAS tokens we deal with today, you should know that Azure also supports account SAS tokens. Account SAS tokens delegate access across one or more storage services in a storage account. You can also delegate operations within a particular service. I like to keep my Azure resource authorization policies simple; thus, I use service tokens nearly exclusively.
Implementing a stored access policy ^
Let's make our scenario more interesting. Our business partner now needs access to all the blobs inside the reports container. To simplify these permissions, we can do two things: create a stored access policy or grant container-level access.
In Storage Explorer, right-click the reports container and select Manage Access Policies... from the shortcut menu. In the following screenshot, you see I defined two SAS policies: one for 1-hour access and the other for 24-hour access:
We can now define a SAS URI for individual blobs as we did earlier, but this time the work is more convenient: we simply select a stored access policy from the drop-down list. Note that deleting a stored access policy immediately invalidates any active SAS tokens that reference it.
The other approach is to right-click the reports container and select Get Shared Access Signature from the context menu. The resulting URL grants access to all blobs inside the container.
Using this URL in a browser simply returns XML as shown in the following screenshot. You'll want to use Azure PowerShell or another Azure Storage library to fetch multiple blobs programmatically.
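Assuming the container listing comes back in the usual List Blobs XML shape (the sample below is a trimmed, hypothetical response, not captured from a live account), a few lines of Python with the standard library's ElementTree are enough to extract the blob names before downloading each one:

```python
import xml.etree.ElementTree as ET

# Trimmed, hypothetical sample of a container List Blobs response
xml_response = """<?xml version="1.0" encoding="utf-8"?>
<EnumerationResults ContainerName="reports">
  <Blobs>
    <Blob><Name>jan2017.csv</Name></Blob>
    <Blob><Name>feb2017.csv</Name></Blob>
  </Blobs>
  <NextMarker />
</EnumerationResults>"""

root = ET.fromstring(xml_response)
# Collect the Name element of every Blob entry
names = [blob.findtext("Name") for blob in root.iter("Blob")]
print(names)  # ['jan2017.csv', 'feb2017.csv']
```

From there, appending each name to the container-level SAS URL gives you a per-blob download address.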
Other management tools ^
Naturally, you can create and manage SAS tokens from within the Azure portal. Navigate to your chosen storage account and select the Shared access signature settings blade. As you can see in the following screenshot, the user interface heavily favors the account-level SAS as opposed to the service-level SAS URI:
Check out the highlight in the previous screenshot; you can provide additional security for SAS URIs by restricting access to particular IP addresses. This feature isn't available in Azure Storage Explorer, but you can get there with Azure PowerShell:
# Don't forget to log into Azure and set your subscription context!
# Set up initial variables
$StorageAccountName = 'sales4sysopsdata'
$resourceGroupName = '4sysops'
# Get the first storage account key
$storKey = (Get-AzureRmStorageAccountKey -ResourceGroupName $resourceGroupName -Name $StorageAccountName)[0].Value
# Define the storage account context
$storCont = New-AzureStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $storKey
# Create an account SAS token restricted to an IP range (substitute your partner's addresses for this example range)
$storSAS = New-AzureStorageAccountSASToken -Service Blob `
    -ResourceType Service `
    -Permission 'rl' `
    -Protocol HttpsOnly `
    -Context $storCont `
    -IPAddressOrRange '192.0.2.1-192.0.2.100' `
    -StartTime (Get-Date).ToUniversalTime() `
    -ExpiryTime (Get-Date).ToUniversalTime().AddHours(1)
# Display the SAS token (query string only)
$storSAS
There you have it! You now know how to use SAS tokens to grant selective access to Azure storage account resources. I'd encourage you to take a closer look at role-based access control (RBAC) authorization to Azure Storage via Azure AD. This is likely the future trend for Azure storage account access, at least for users with Azure AD accounts. As always, keep least-privilege security at the top of your mind, and take good care!