Shared access signature (SAS) tokens let you grant restricted access to storage account resources in Microsoft Azure.

Let's imagine you manage your company's Microsoft Azure subscription. You receive a request from an important business partner who needs to see a specific sales data file named jan2017.csv stored in the binary large object (blob) service of a storage account named sales4sysopsdata.

Corporate security policy prevents you from sending a copy of the requested file via email file attachment or FTPS. How can you solve the problem?

Today I will teach you how to use shared access signature (SAS) tokens to provide time-restricted access to blob resources in Azure storage accounts.

Setting the stage ^

In today's exercise, we will use Microsoft's free Azure Storage Explorer desktop application to grant our business partner her desired level of access to that sales file. Go ahead and install Storage Explorer, start the application, and authenticate to your subscription.

As you can see in the following screenshot, the jan2017.csv file is in a container named reports in the sales4sysopsdata storage account.

Azure Storage Explorer

Creating your first SAS URL ^

In Storage Explorer, right-click jan2017.csv and select Get Shared Access Signature… from the context menu. The Shared Access Signature form includes the following fields:

  • Access policy: A stored access policy is a way to manage multiple SAS tokens in the same container. We'll deal with this option later in today's tutorial.
  • Start time: SAS URLs are time-limited; the best practice is to set the start time a few minutes before the current time to absorb clock skew between your machine and the storage service.
  • Expiry time: Because a SAS URL is usable by anybody on Earth with that address, make the expiration time as short as is feasible for your use case.
  • Time zone: All SAS timestamps need to be in Coordinated Universal Time (UTC). However, Storage Explorer helpfully converts the time if you elect to use your local time zone.
  • Container-level URI: Use this to open up an entire container for access instead of a single resource.

Note: I tend to use Uniform Resource Identifier (URI) and Uniform Resource Locator (URL) interchangeably. Technically, though, a SAS address is a URI; adding an access method (such as HTTPS) that a client application can use to retrieve the resource is what makes a URI a URL.
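Because all SAS timestamps must be in UTC, it helps to see how a back-dated start time and a short expiry look in code. Here is a minimal Python sketch; the five-minute buffer and one-hour lifetime are example values, not Azure requirements:

```python
from datetime import datetime, timedelta, timezone

def sas_window(lifetime_hours=1, skew_minutes=5):
    """Compute SAS start/expiry timestamps in the ISO-8601 UTC format Azure expects."""
    now = datetime.now(timezone.utc)
    start = now - timedelta(minutes=skew_minutes)   # back-date to absorb clock skew
    expiry = now + timedelta(hours=lifetime_hours)  # keep the lifetime short
    fmt = "%Y-%m-%dT%H:%M:%SZ"
    return start.strftime(fmt), expiry.strftime(fmt)

st, se = sas_window()
print(st, "->", se)
```

Storage Explorer does this conversion for you, but if you generate tokens from scripts, back-dating the start time this way saves you from "token not yet valid" errors on machines with slightly fast clocks.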

Creating a SAS token

The second screen in the Shared Access Signature dialog box is important because it gives you the actual URL. Copy this long address into the Windows Clipboard. Let me break down each section of a SAS URL (check the documentation as well for additional details):

  • https://sales4sysopsdata.blob.core.windows.net/reports/jan2017.csv: The base URI to the blob in question
  • ?st=2018-09-25T12%3A48%3A49Z: Start time at which the SAS token becomes valid
  • &se=2018-09-25T13%3A48%3A00Z: Expiration time
  • &sp=rl: Permissions—in this example, we allow Read and List permissions on the blob (this enables our business partner to download the file directly by using her web browser)
  • &sv=2018-03-28: Storage service application programming interface (API) version
  • &sr=b: Signed resource—b denotes a single blob (c would denote a container). In this article, we deal with service SAS URLs, which delegate access to a single Azure storage service: blob, queue, table, or file
  • &sig=xiWghgWBme7et2E%2FANxVhkGmYTVKkg9wVYUw3wi5gf4%3D: Digital signature—Azure signs the SAS URI with one of your two storage account secret keys, so changing a single value in the URI invalidates the entire token
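Because the token is nothing more than a query string, you can dissect any SAS URL programmatically. A quick Python sketch using only the standard library (the sig value below is a placeholder, not a real signature):

```python
from urllib.parse import urlsplit, parse_qs

# Sample SAS URL mirroring the fields above; the signature is a placeholder
sas_url = ("https://sales4sysopsdata.blob.core.windows.net/reports/jan2017.csv"
           "?st=2018-09-25T12%3A48%3A49Z&se=2018-09-25T13%3A48%3A00Z"
           "&sp=rl&sv=2018-03-28&sr=b&sig=PLACEHOLDER")

parts = urlsplit(sas_url)
# parse_qs percent-decodes values and returns lists; flatten to single values
params = {k: v[0] for k, v in parse_qs(parts.query).items()}

print(parts.path)                         # /reports/jan2017.csv
print(params["sp"])                       # rl -> Read + List
print(params["st"], "->", params["se"])   # validity window (UTC)
```

This kind of inspection is handy when a partner reports an "invalid signature" error: you can confirm at a glance whether the token they received still has all of its parameters intact.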

Sure enough, as long as our business partner accesses the SAS URL within the token's lifetime, she can download the CSV file as shown in the following screenshot.

Accessing a SAS protected Azure storage resource

Additional detail ^

Traditionally, two secret access keys protect Azure storage accounts. It's good security hygiene in Azure to rotate these values periodically. The benefit SAS tokens bring to the table is that you do not have to share either of these highly sensitive keys. What's both convenient and cool is you can use Azure Key Vault to store and access these account keys programmatically.

More recently, Microsoft has introduced Azure Active Directory (Azure AD)-based authentication to Azure storage account resources. However, this latter feature is in preview state, which means it has no service-level agreement (SLA). Thus, you should not implement it for production workloads.

Besides the service SAS tokens we deal with today, you should know that Azure also supports account SAS tokens. Account SAS tokens delegate access across one or more storage services in a storage account. You can also delegate operations within a particular service. I like to keep my Azure resource authorization policies simple; thus, I use service tokens nearly exclusively.

Implementing a stored access policy ^

Let's make our scenario more interesting. Our business partner now needs access to all the blobs inside the reports container. To simplify these permissions, we can do one of two things: create a stored access policy or grant container-level access.

In Storage Explorer, right-click the reports container and select Manage Access Policies... from the shortcut menu. In the following screenshot, you see I defined two SAS policies: one for 1-hour access and the other for 24-hour access:

Creating stored access policies

We now can define a SAS URI for individual blobs as we did earlier, but this time our work becomes more convenient by being able to select a stored access policy from the drop-down list. Note that deleting a stored access policy immediately invalidates any active SAS tokens that reference that policy.
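To make the difference between the two token styles concrete, here is an illustrative comparison in Python: an ad hoc token carries its permissions and lifetime inline, whereas a policy-based token carries only an si (signed identifier) parameter naming the stored access policy, which supplies those settings server-side. The policy name sas-1hr and the sig values are made up for the example:

```python
from urllib.parse import urlencode

base = "https://sales4sysopsdata.blob.core.windows.net/reports/jan2017.csv"

# Ad hoc token: permissions (sp) and lifetime (st/se) travel in the URL itself
adhoc = urlencode({"sp": "rl", "st": "2018-09-25T12:48:49Z",
                   "se": "2018-09-25T13:48:00Z", "sv": "2018-03-28",
                   "sr": "b", "sig": "PLACEHOLDER"})

# Policy-based token: si names the stored access policy; permissions and
# lifetime live in the policy on the server (policy name is illustrative)
policy = urlencode({"si": "sas-1hr", "sv": "2018-03-28",
                    "sr": "b", "sig": "PLACEHOLDER"})

print(f"{base}?{adhoc}")
print(f"{base}?{policy}")
```

The practical payoff is revocability: because the policy-based URL contains no inline lifetime, editing or deleting the sas-1hr policy changes or kills every token minted against it, without touching your account keys.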

Accessing stored access policies

The other way to go is to right-click the reports container and select Get Shared Access Signature from the context menu. The resulting URL grants access to all blobs inside the container.

Using this URL in a browser simply returns XML as shown in the following screenshot. You'll want to use Azure PowerShell or another Azure Storage library to fetch multiple blobs programmatically.
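If you'd rather script the listing yourself, the container-level request returns an XML EnumerationResults document that any XML library can parse. A Python sketch against an abridged, hand-written sample response (real responses carry more metadata per blob):

```python
import xml.etree.ElementTree as ET

# Abridged sample of a container listing response (hand-written for illustration)
listing = """<?xml version="1.0" encoding="utf-8"?>
<EnumerationResults ContainerName="https://sales4sysopsdata.blob.core.windows.net/reports">
  <Blobs>
    <Blob><Name>jan2017.csv</Name></Blob>
    <Blob><Name>feb2017.csv</Name></Blob>
  </Blobs>
</EnumerationResults>"""

root = ET.fromstring(listing)
# Collect every <Name> child of every <Blob> element
names = [blob.findtext("Name") for blob in root.iter("Blob")]
print(names)  # ['jan2017.csv', 'feb2017.csv']
```

From there, each name can be appended to the container URI (plus the SAS query string) to download the individual blobs.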

Listing all container blobs

Other management tools ^

Naturally, you can create and manage SAS tokens from within the Azure portal. Navigate to your chosen storage account and select the Shared access signature settings blade. As you can see in the following screenshot, the user interface heavily favors the account-level SAS as opposed to the service-level SAS URI:

Creating a SAS token in the Azure portal

Check out the highlight in the previous screenshot; you can provide additional security for SAS URIs by restricting access to particular IP addresses. This feature isn't available in Azure Storage Explorer, but you can get there with Azure PowerShell:


# Don't forget to log into Azure and set your subscription context!
# Set up initial variables
$StorageAccountName = 'sales4sysopsdata'
$resourceGroupName = '4sysops'

# Get the first storage account access key
$storKey = (Get-AzureRmStorageAccountKey -ResourceGroupName $resourceGroupName -Name $StorageAccountName)[0].Value

# Define storage account context
$storCont = New-AzureStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $storKey

# Create an account SAS token scoped to the blob service
# (203.0.113.0/24 is a sample range; substitute your partner's public IP)
$storSAS = New-AzureStorageAccountSASToken -Service Blob `
    -ResourceType Service,Container,Object `
    -Permission 'rl' `
    -Protocol HttpsOnly `
    -Context $storCont `
    -IPAddressOrRange '203.0.113.0/24' `
    -StartTime (Get-Date).ToUniversalTime() `
    -ExpiryTime (Get-Date).ToUniversalTime().AddHours(1)

# Display the SAS token (query string only)
$storSAS

Wrap-up ^

There you have it! You now know how to use SAS tokens to grant selective access to Azure storage account resources. I'd encourage you to take a closer look at role-based access control (RBAC) authorization to Azure Storage via Azure AD. This is likely the future trend for Azure storage account access, at least for users with Azure AD accounts. As always, keep least-privilege security at the top of your mind, and take good care!

  1. Mauro 3 years ago


    I have used azure storage with SAS keys and powershell to import PST in office 365 mailboxes, imho was more simple and straightforward than using the office 365 network upload service - no need to use azcopy tool and prepare the .csv mapping file.

    If someone is interested i can share the code


  2. Saurav Chakraborty 2 years ago


    I was trying to create a SAS token based on a storage policy via Azure Storage Explorer. I appended the query parameter &si=<StoragePolicyName> to the URL generated after clicking the 'Create' button in the 'Generate Shared Access Signature' dialog box. It does not seem to work. Ad hoc SAS tokens are working, though. Could you provide some guidance on this?

    Thanks and Regards

    Saurav Chakraborty

    Plural Sight Follower


    • Hi Saurav,

      I just tested the SAS token with Storage Explorer using both methods (policy and ad hoc). It is working fine for me. Could you please try generating the token for some public/test data and share the query string/URL here? That will help us identify the issue.


  3. Saurav Chakraborty 2 years ago

    Hi Swapnil, 

    Thanks for the response. I tried to generate the SAS token from the portal by referring to the policy from my personal account using Storage Explorer in the portal. It just hangs after clicking the 'Create' button.

    From our enterprise account, I generated the token using Storage Explorer in the portal, but the URL always fails with an invalid signature error. Unfortunately, I am not able to share the URLs from our enterprise account.

    In the end, I  used Azure CLI to create a policy and generate a SAS Token. This seems to work.

    Thanks and Regards

    Saurav Chakraborty




    • Hi Saurav,

      Thank you. Ever since the beginning, there has been a feature disparity between the Azure Storage management tools. IMO, Azure CLI is the best tool to manage Azure Storage. Next would be the Azure portal and the Azure REST API; Storage Explorer would be the last option for me.


  4. Payal Goel 2 years ago

    Hi Saurav,

    I am creating the Stored Access policy programmatically and the SAS also. the question is I want to update the expiry time based upon few conditions, for this scenario I need to fetch the existing policy on a container. 

    Could you please help me with this approach.

    Thanks & Regards,

    Payal Goel


  5. Saurav Chakraborty 2 years ago
  6. Sumana 1 year ago

    Can I create a SAS url to access all files and folders of a directory in Azure File share (i.e. not Blobs) in C# by using rest API?


© 4sysops 2006 - 2021

