
Cloud Azure


binaryrain_helper_cloud_azure is a Python package that simplifies common tasks in Azure Cloud. It builds on top of the azure SDK and provides additional functionality that makes working with Azure easier, reduces boilerplate code, and produces clear error messages.

To install the package, use your favorite Python package manager:

pip install binaryrain-helper-cloud-azure

return_http_response() -> azure.functions.HttpResponse

Handles returning HTTP responses with status codes and messages:

from binaryrain_helper_cloud_azure.azure import return_http_response
import json
# Return a 200 OK response
return return_http_response('Success Message', 200)
# Return json data with a 201 Created response
return return_http_response(json.dumps({'key': 'value'}), 201)
# Return a 404 Not Found response
return return_http_response('Resource not found', 404)
# Return a 500 Internal Server Error response
return return_http_response('Internal Server Error', 500)
  • message: str | The message to be returned in the response.
  • status_code: int | The status code of the response.
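Inside an HTTP-triggered function, the helper pairs naturally with a small payload builder. The `build_payload` function below is hypothetical, not part of the package; it only sketches choosing a message and status code before handing them to `return_http_response`:

```python
import json

def build_payload(result, error=None):
    # Hypothetical helper: serialize a result (or error) and pick a status code.
    if error is not None:
        return json.dumps({"error": str(error)}), 500
    return json.dumps({"result": result}), 200

# Inside an Azure Function you would then write:
#   message, status = build_payload(do_work())
#   return return_http_response(message, status)
```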

read_blob_data() -> bytes

Provides a simplified way to read data from Azure Blob Storage:

from binaryrain_helper_cloud_azure.azure import read_blob_data
# Read a Parquet file from blob storage (returns the raw bytes)
data = read_blob_data(
    blob_account="your_account",
    container_name="your_container",
    blob_name="data.parquet",
)
# Read a CSV file
data = read_blob_data(
    blob_account="your_account",
    container_name="your_container",
    blob_name="data.csv",
)
# Read with a custom BlobServiceClient
from azure.storage.blob import BlobServiceClient
custom_client = BlobServiceClient(account_url="https://your_account.blob.core.windows.net/")
data = read_blob_data(
    blob_account="your_account",
    container_name="your_container",
    blob_name="data.csv",
    blob_service_client=custom_client,
)
  • blob_account: str | The name of the blob account, e.g. "https://YOUR-ACCOUNT.blob.core.windows.net/".
  • container_name: str | The name of the container.
  • blob_name: str | The name of the blob.
  • blob_service_client: BlobServiceClient | None | (Optional) An optional BlobServiceClient instance. If not provided, a new one will be created.
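Since the helper returns raw bytes, CSV content still needs to be parsed. A minimal stdlib sketch (the helper name is our own; pandas `read_csv` on a `BytesIO` works just as well):

```python
import csv
import io

def csv_bytes_to_rows(data: bytes, delimiter: str = ";") -> list[dict]:
    # Decode the blob bytes and parse each line into a dict keyed by header.
    text = data.decode("utf-8")
    return list(csv.DictReader(io.StringIO(text), delimiter=delimiter))
```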

upload_blob_data() -> bool

Handles uploading data to blob storage:

from binaryrain_helper_cloud_azure.azure import upload_blob_data
# Upload data as Parquet
upload_blob_data(
    blob_account="your_account",
    container_name="your_container",
    blob_name="data.parquet",
    file_contents=your_data,
)
# Upload a DataFrame as CSV
upload_blob_data(
    blob_account="your_account",
    container_name="your_container",
    blob_name="data.csv",
    file_contents=bytes(df.to_csv(sep=";", index=False), encoding="utf-8"),
)
# Upload with a custom BlobServiceClient
from azure.storage.blob import BlobServiceClient
custom_client = BlobServiceClient(account_url="https://your_account.blob.core.windows.net/")
upload_blob_data(
    blob_account="your_account",
    container_name="your_container",
    blob_name="data.parquet",
    file_contents=your_data,
    blob_service_client=custom_client,
)
  • blob_account: str | The name of the blob account.
  • container_name: str | The name of the container.
  • blob_name: str | The name of the blob.
  • file_contents: bytes | The file contents to be saved.
  • blob_service_client: BlobServiceClient | None | (Optional) An optional BlobServiceClient instance. If not provided, a new one will be created.
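Because `file_contents` must already be bytes, in-memory objects have to be serialized first. A small sketch for JSON payloads (the helper name is our own, not part of the package):

```python
import json

def to_blob_bytes(obj) -> bytes:
    # Serialize any JSON-compatible object to UTF-8 bytes for file_contents.
    return json.dumps(obj).encode("utf-8")

# upload_blob_data(..., file_contents=to_blob_bytes({"status": "done"}))
```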

get_secret_data() -> dict

Gets secret data from Azure Key Vault:

from binaryrain_helper_cloud_azure.azure import get_secret_data
secret = get_secret_data("your_keyvault_url", "your_secret_name")
  • key_vault_url: str | The URL of the Azure Key Vault.
  • secret_name: str | The name of the secret.
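Key Vault round-trips are comparatively slow, so it can pay off to cache lookups per process. A hedged sketch using `functools.lru_cache`; `fetch_secret` is a stand-in for a real call to `get_secret_data`:

```python
from functools import lru_cache

calls = {"count": 0}

def fetch_secret(key_vault_url: str, secret_name: str) -> str:
    # Stand-in for get_secret_data(key_vault_url, secret_name).
    calls["count"] += 1
    return f"secret-for-{secret_name}"

@lru_cache(maxsize=None)
def cached_secret(key_vault_url: str, secret_name: str) -> str:
    # Repeated lookups with the same arguments hit Key Vault only once.
    return fetch_secret(key_vault_url, secret_name)
```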

create_adf_pipeline() -> str

Creates an Azure Data Factory pipeline run:

from binaryrain_helper_cloud_azure.azure import create_adf_pipeline
# Create a pipeline run with parameters
params_json = {"param1": "value1"}
pipeline_id = create_adf_pipeline(
    subscription_id="your_subscription_id",
    resource_group_name="your_resource_group_name",
    factory_name="your_adf_name",
    pipeline_name="your_pipeline_name",
    parameters=params_json,
)
# Create a pipeline run with custom credentials
from azure.identity import DefaultAzureCredential
custom_credentials = DefaultAzureCredential()
pipeline_id = create_adf_pipeline(
    subscription_id="your_subscription_id",
    resource_group_name="your_resource_group_name",
    factory_name="your_adf_name",
    pipeline_name="your_pipeline_name",
    credentials=custom_credentials,
)
  • subscription_id: str | The subscription ID of the Azure account.
  • resource_group_name: str | The name of the resource group.
  • factory_name: str | The name of the Data Factory.
  • pipeline_name: str | The name of the pipeline.
  • parameters: dict | None | (Optional) The parameters to be passed to the pipeline.
  • credentials: DefaultAzureCredential | TokenCredential | None | (Optional) The credentials to be used for authentication. Defaults to None, which will create a new DefaultAzureCredential() instance.
  • adf_base_url: str | (Optional) The base URL of the Azure Data Factory Management API. Defaults to https://management.azure.com
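The `adf_base_url` parameter suggests the helper targets the Data Factory management REST API; the documented `createRun` endpoint has the shape below. The URL builder is our own sketch, not the package's internals:

```python
def create_run_url(
    subscription_id: str,
    resource_group_name: str,
    factory_name: str,
    pipeline_name: str,
    adf_base_url: str = "https://management.azure.com",
) -> str:
    # Shape of the documented ADF createRun REST endpoint (api-version 2018-06-01).
    return (
        f"{adf_base_url}/subscriptions/{subscription_id}"
        f"/resourceGroups/{resource_group_name}"
        f"/providers/Microsoft.DataFactory/factories/{factory_name}"
        f"/pipelines/{pipeline_name}/createRun"
        "?api-version=2018-06-01"
    )
```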

list_blob_names_in_container() -> list[str]

Lists all blob names in a container with optional filtering:

from binaryrain_helper_cloud_azure.azure import list_blob_names_in_container
# List all blobs in a container
blob_names = list_blob_names_in_container(
    blob_storage_account="https://your_account.blob.core.windows.net/",
    container_name="your_container",
)
# List blobs with a prefix filter
blob_names = list_blob_names_in_container(
    blob_storage_account="https://your_account.blob.core.windows.net/",
    container_name="your_container",
    starts_with="logs/2024/",
)
# List blobs with include options
blob_names = list_blob_names_in_container(
    blob_storage_account="https://your_account.blob.core.windows.net/",
    container_name="your_container",
    include=["metadata", "snapshots"],
)
# List blobs with a custom BlobServiceClient
from azure.storage.blob import BlobServiceClient
custom_client = BlobServiceClient(account_url="https://your_account.blob.core.windows.net/")
blob_names = list_blob_names_in_container(
    blob_storage_account="https://your_account.blob.core.windows.net/",
    container_name="your_container",
    blob_service_client=custom_client,
)
  • blob_storage_account: str | The name of the blob storage account.
  • container_name: str | The name of the container.
  • blob_service_client: BlobServiceClient | None | (Optional) An optional BlobServiceClient instance. If not provided, a new one will be created.
  • starts_with: str | None | (Optional) Filter blobs whose names begin with the specified prefix.
  • include: str | list[str] | None | (Optional) Specify one or more additional datasets to include in the response (e.g. "metadata", "snapshots", "deleted").
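Blob storage has no real directories: a "/" in a blob name only simulates a folder. Once the names are returned, virtual top-level folders can be derived locally (the helper name is our own):

```python
def top_level_prefixes(blob_names: list[str]) -> list[str]:
    # Blob containers are flat; collect the distinct first "folder" segments.
    return sorted({name.split("/", 1)[0] for name in blob_names if "/" in name})
```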