Azure Blob Storage
This guide explains how to configure Azure Blob Storage access for the Data Factory HTTP task using SAS (Shared Access Signature) tokens.
Overview
Azure Blob Storage uses different authentication methods than AWS S3. The HTTP task supports Azure through SAS Tokens, which are appended to the request URL as query parameters.
Authentication Methods
| Method | Supported | Description |
|---|---|---|
| SAS Token | Yes | Shared Access Signature appended to URL |
| Shared Key | No | Requires custom signature calculation |
| Azure AD | Partial | Via OAuth2 Client Credentials |
Supported Operations
- List blobs in a container
- Upload blobs (PUT)
- Download blobs (GET)
- Delete blobs (DELETE)
- Get blob properties (HEAD)
Prerequisites
- An Azure account
- Access to the Azure Portal
- A Storage Account created
- Product-Live account with access to the Data Factory platform
Step 1: Create a Storage Account
- Sign in to the Azure Portal
- Navigate to Storage accounts
- Click + Create
- Fill in the details:
- Subscription: Select your subscription
- Resource group: Select or create one
- Storage account name: `productlivedatafactory` (lowercase, no special characters)
- Region: Select the region closest to your users
- Performance: Standard
- Redundancy: LRS (or as needed)
- Click Review + create → Create
Step 2: Create a Container
- Open your Storage Account
- Navigate to Data storage → Containers
- Click + Container
- Enter a Name (e.g., `data-factory`)
- Set Public access level to Private
- Click Create
Step 3: Generate a SAS Token
Option A: Account-level SAS (Recommended for multiple containers)
- Open your Storage Account
- Navigate to Security + networking → Shared access signature
- Configure the SAS:
- Allowed services: Blob
- Allowed resource types: Container, Object
- Allowed permissions: Read, Write, Delete, List (as needed)
- Start and expiry date/time: Set appropriate validity period
- Allowed protocols: HTTPS only
- Click Generate SAS and connection string
- Copy the SAS token (it starts with `?sv=...`)
Option B: Container-level SAS (More restrictive)
- Navigate to your container
- Click ... → Generate SAS
- Configure permissions and expiry
- Click Generate SAS token and URL
- Copy the SAS token
Important
Save the SAS token securely! It provides access to your storage. Set an appropriate expiry date and rotate tokens regularly.
Step 4: Note Your Endpoint
Azure Blob Storage uses the following URL format:
`https://{storage-account}.blob.core.windows.net/{container}/{blob-path}?{sas-token}`
Example components:
- Storage Account: `mystorageaccount`
- Container: `data-factory`
- Endpoint: `mystorageaccount.blob.core.windows.net`
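Putting the pieces together, the full request URL is plain string concatenation. A minimal sketch using the placeholder values above (the token below is fake):

```python
# Illustrative sketch: assembling an Azure Blob Storage request URL from its
# components. All values are placeholders, including the fake SAS token.
storage_account = "mystorageaccount"
container = "data-factory"
blob_path = "my-folder/my-file.txt"
sas_token = "?sv=2021-06-08&ss=b&srt=co&sp=rl&sig=example"  # fake token

url = f"https://{storage_account}.blob.core.windows.net/{container}/{blob_path}{sas_token}"
print(url)
```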
Step 5: Configure Data Factory Variables
Create the following variables in your Data Factory project:
| Variable Name | Description | Example |
|---|---|---|
| `azure_storage_account` | Storage account name | `mystorageaccount` |
| `azure_container` | Container name | `data-factory` |
| `azure_sas_token` | SAS token (including the leading `?`) | `?sv=2021-06-08&ss=b&srt=co&sp=rwdlac...` |
Security Best Practice
Mark the `azure_sas_token` variable as a secret in Data Factory to prevent it from appearing in logs.
Step 6: Use in HTTP Task
Important: SAS Token in Path
The SAS token must be appended directly to the `path` parameter value; do not pass it as a separate query parameter.
Example configuration for listing blobs in a container:
```json
{
  "name": "protocol-http-perform",
  "taskReferenceName": "list-azure-blobs",
  "type": "SUB_WORKFLOW",
  "inputParameters": {
    "scheme": "HTTPS",
    "method": "GET",
    "domain": "${workflow.variables.azure_storage_account}.blob.core.windows.net",
    "path": "/${workflow.variables.azure_container}${workflow.variables.azure_sas_token}&restype=container&comp=list",
    "headers": {
      "x-ms-version": "2021-06-08"
    },
    "responses": ["STRING"],
    "connectionTimeOutMilliseconds": 30000
  }
}
```
Common Azure Blob Storage Operations
List Blobs in a Container
```json
{
  "scheme": "HTTPS",
  "method": "GET",
  "domain": "${workflow.variables.azure_storage_account}.blob.core.windows.net",
  "path": "/${workflow.variables.azure_container}${workflow.variables.azure_sas_token}&restype=container&comp=list",
  "headers": {
    "x-ms-version": "2021-06-08"
  },
  "responses": ["STRING"]
}
```
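The List Blobs call returns an XML document (`EnumerationResults`). A minimal sketch of extracting blob names from such a response; `sample_xml` is a simplified stand-in for the STRING response the task would return:

```python
import xml.etree.ElementTree as ET

# Illustrative sketch: pulling blob names out of a List Blobs XML response.
# The payload below is a trimmed-down sample, not a full Azure response.
sample_xml = """<?xml version="1.0" encoding="utf-8"?>
<EnumerationResults ContainerName="data-factory">
  <Blobs>
    <Blob><Name>my-folder/my-file.txt</Name></Blob>
    <Blob><Name>my-folder/image.png</Name></Blob>
  </Blobs>
</EnumerationResults>"""

# Encode to bytes: ElementTree rejects str input that carries an XML
# encoding declaration.
root = ET.fromstring(sample_xml.encode("utf-8"))
blob_names = [blob.findtext("Name") for blob in root.iter("Blob")]
print(blob_names)  # ['my-folder/my-file.txt', 'my-folder/image.png']
```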
Upload a Text File
```json
{
  "scheme": "HTTPS",
  "method": "PUT",
  "domain": "${workflow.variables.azure_storage_account}.blob.core.windows.net",
  "path": "/${workflow.variables.azure_container}/my-folder/my-file.txt${workflow.variables.azure_sas_token}",
  "headers": {
    "x-ms-version": "2021-06-08",
    "x-ms-blob-type": "BlockBlob",
    "Content-Type": "text/plain"
  },
  "body": {
    "type": "PLAIN",
    "contentType": "text/plain",
    "text": "Hello from Product-Live Data Factory!"
  },
  "responses": ["STRING"]
}
```
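For local testing outside Data Factory, the same upload can be expressed as a raw HTTP request. A sketch using Python's standard library with placeholder account and token values; the request object is built but deliberately not sent:

```python
import urllib.request

# Illustrative sketch: the raw HTTP request equivalent of the upload
# configuration above. Account, container, and token are placeholders.
storage_account = "mystorageaccount"
container = "data-factory"
sas_token = "?sv=2021-06-08&sp=w&sig=example"  # fake token

url = (f"https://{storage_account}.blob.core.windows.net/"
       f"{container}/my-folder/my-file.txt{sas_token}")
req = urllib.request.Request(
    url,
    data=b"Hello from Product-Live Data Factory!",
    method="PUT",
    headers={
        "x-ms-version": "2021-06-08",
        "x-ms-blob-type": "BlockBlob",
        "Content-Type": "text/plain",
    },
)
# Not sent here; urllib.request.urlopen(req) would perform the upload.
print(req.get_method(), req.full_url)
```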
Upload a Binary File
```json
{
  "scheme": "HTTPS",
  "method": "PUT",
  "domain": "${workflow.variables.azure_storage_account}.blob.core.windows.net",
  "path": "/${workflow.variables.azure_container}/my-folder/image.png${workflow.variables.azure_sas_token}",
  "headers": {
    "x-ms-version": "2021-06-08",
    "x-ms-blob-type": "BlockBlob",
    "Content-Type": "image/png"
  },
  "body": {
    "type": "BINARY_FILE",
    "file": {
      "url": "${previous_task.output.file.url}"
    }
  },
  "responses": ["STRING"]
}
```
Download a File
```json
{
  "scheme": "HTTPS",
  "method": "GET",
  "domain": "${workflow.variables.azure_storage_account}.blob.core.windows.net",
  "path": "/${workflow.variables.azure_container}/my-folder/my-file.txt${workflow.variables.azure_sas_token}",
  "headers": {
    "x-ms-version": "2021-06-08"
  },
  "responses": ["FILE"]
}
```
Delete a File
```json
{
  "scheme": "HTTPS",
  "method": "DELETE",
  "domain": "${workflow.variables.azure_storage_account}.blob.core.windows.net",
  "path": "/${workflow.variables.azure_container}/my-folder/my-file.txt${workflow.variables.azure_sas_token}",
  "headers": {
    "x-ms-version": "2021-06-08"
  }
}
```
Get Blob Properties (HEAD)
```json
{
  "scheme": "HTTPS",
  "method": "HEAD",
  "domain": "${workflow.variables.azure_storage_account}.blob.core.windows.net",
  "path": "/${workflow.variables.azure_container}/my-folder/my-file.txt${workflow.variables.azure_sas_token}",
  "headers": {
    "x-ms-version": "2021-06-08"
  },
  "responses": ["HEADERS"]
}
```
Azure Regions Reference
| Region Code | Location |
|---|---|
| `westeurope` | Netherlands |
| `northeurope` | Ireland |
| `francecentral` | Paris, France |
| `francesouth` | Marseille, France |
| `germanywestcentral` | Frankfurt, Germany |
| `uksouth` | London, UK |
| `eastus` | Virginia, USA |
| `westus` | California, USA |
For a complete list, see Azure Regions.
Required Headers
Azure Blob Storage requires specific headers:
| Header | Required | Description |
|---|---|---|
| `x-ms-version` | Yes | API version (e.g., `2021-06-08`) |
| `x-ms-blob-type` | For PUT | Blob type: `BlockBlob`, `AppendBlob`, or `PageBlob` |
| `Content-Type` | For PUT | MIME type of the content |
SAS Token Permissions Reference
| Permission | Code | Description |
|---|---|---|
| Read | r | Read blob content and properties |
| Write | w | Create or modify blob content |
| Delete | d | Delete blobs |
| List | l | List blobs in a container |
| Add | a | Add blocks to append blobs |
| Create | c | Create new blobs |
Example SAS token with read, write, delete, and list:
`?sv=2021-06-08&ss=b&srt=co&sp=rwdl&se=2026-12-31T23:59:59Z&st=2026-01-01T00:00:00Z&spr=https&sig=...`
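The token's query parameters can be inspected programmatically, which helps when debugging permission problems. A sketch using a placeholder token with a fake signature:

```python
from urllib.parse import parse_qs

# Illustrative sketch: splitting a SAS token into its query parameters.
# The token below is a placeholder with a fake signature.
sas_token = ("?sv=2021-06-08&ss=b&srt=co&sp=rwdl"
             "&se=2026-12-31T23:59:59Z&st=2026-01-01T00:00:00Z"
             "&spr=https&sig=FAKE")

params = parse_qs(sas_token.lstrip("?"))
permissions = params["sp"][0]  # granted permissions, e.g. "rwdl"
expiry = params["se"][0]       # expiry timestamp

print(permissions, expiry)
```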
Troubleshooting
Error: "AuthenticationFailed"
- The SAS token has expired - generate a new one
- The SAS token doesn't have the required permissions
- The storage account or container name is incorrect
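Since an expired token is the most common cause, a quick sketch for checking the `se` (expiry) field of a token (placeholder token, fake signature):

```python
from datetime import datetime, timezone
from urllib.parse import parse_qs

# Illustrative sketch: checking whether a SAS token's `se` expiry has passed.
# The token is a placeholder with a fake signature.
sas_token = "?sv=2021-06-08&sp=rl&se=2026-12-31T23:59:59Z&sig=FAKE"

expiry_text = parse_qs(sas_token.lstrip("?"))["se"][0]
expiry = datetime.strptime(expiry_text, "%Y-%m-%dT%H:%M:%SZ").replace(tzinfo=timezone.utc)
expired = datetime.now(timezone.utc) > expiry
print("token expired:", expired)
```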
Error: "AuthorizationPermissionMismatch"
- The SAS token doesn't include the required permission
- Check the `sp` parameter in the SAS token
Error: "BlobNotFound"
- The blob path is incorrect
- The container name is case-sensitive
Error: "InvalidHeaderValue" for x-ms-version
- The API version is not supported
- Use a recent version, such as `2021-06-08` or `2023-11-03`
Error: "UnsupportedHeader"
- Remove any unsupported headers
- Azure has specific header requirements
Comparison with AWS S3
| Feature | AWS S3 | Azure Blob Storage |
|---|---|---|
| Authentication | Signature V4 | SAS Token |
| URL Format | {bucket}.s3.{region}.amazonaws.com | {account}.blob.core.windows.net/{container} |
| Required Headers | Minimal | x-ms-version, x-ms-blob-type |
| List Operation | Query params | Query params + restype=container&comp=list |
Security Best Practices
- Use short-lived SAS tokens - Set expiry dates appropriately
- Limit permissions - Only grant required permissions (r, w, d, l)
- Use HTTPS only - Set `spr=https` in the SAS token
- Restrict IP addresses - Use the `sip` parameter if possible
- Rotate tokens regularly - Generate new SAS tokens periodically
- Never commit tokens - Use Data Factory secrets
- Use container-level SAS - More restrictive than account-level
- Enable logging - Monitor access via Azure Storage Analytics