In the last two posts, we’ve set up our local Python environment, installed the necessary Azure libraries, and authenticated our local machine to Azure. We’ve laid the groundwork for a hybrid cloud lab, but so far, all our work has stayed on our local machine. Now, we’ll bridge the gap by creating our first remote resource: an Azure Storage Account. Once the storage account exists, we’ll use Python to upload a file from our local machine to a container inside it, demonstrating the “hybridity” of our environment.
Creating the Storage Account and Container
This is an excellent use case for Azure CLI. Instead of logging in to the Azure dashboard and clicking around the GUI, we’ll work from the command line.
First, we have to create a resource group, which is a logical container for your Azure resources. This is a best practice for organizing and managing your projects, allowing you to delete all associated resources with a single command when you’re done.
In your terminal, make sure you’re authenticated to Azure (if you’re not, run az login, or az login --use-device-code if you’re in WSL). Then, run the following command to create a resource group in the East US region:
az group create --name hybrid-cloud-rg --location eastus
If all goes well, you should see some JSON output confirming the resource group creation.
Now, within the resource group, we’ll create the storage account itself. We’ll give it a globally unique name (storage account names must be unique across all of Azure), and we’ll use the Standard_LRS SKU for locally redundant storage, which is plenty for our hybrid cloud lab.
az storage account create --name youruniquestorageaccountname --resource-group hybrid-cloud-rg --location eastus --sku Standard_LRS --kind StorageV2
This command can take a couple of minutes to run, so be patient and don’t close the terminal while it’s running. Once complete, we’ll create a container within that storage account. I’ll call mine digital-humanities-project.
az storage container create --name digital-humanities-project --account-name youruniquestorageaccountname
Uploading a File from Local Machine to Cloud
Now that our remote infrastructure is in place, we can connect our local machine to the cloud with a Python script. We’ll use the azure-storage-blob library we installed in a previous post, which allows us to interact with our new storage account and container.
First, create a text file on your local computer:
echo "First file upload!" > first-file.txt
Now, we’ll write a Python script to upload this file. Create a file called upload.py and add the following code:
import os

from azure.storage.blob import BlobServiceClient

# Replace with your unique storage account name
storage_account_name = "youruniquestorageaccountname"
container_name = "digital-humanities-project"
local_file_name = "first-file.txt"

# Get the connection string from the Azure CLI.
# The --query and --output flags strip the JSON wrapper so we get the raw string.
print("Getting connection string...")
cmd = (
    f"az storage account show-connection-string --name {storage_account_name} "
    "--query connectionString --output tsv"
)
connection_string = os.popen(cmd).read().strip()
print("Connection string acquired.")

# Create the BlobServiceClient
blob_service_client = BlobServiceClient.from_connection_string(connection_string)

# Get a client to interact with the container
container_client = blob_service_client.get_container_client(container_name)

# Get a client to interact with the specific blob (file)
blob_client = container_client.get_blob_client(local_file_name)

# Upload the file
print(f"Uploading {local_file_name} to {container_name}...")
with open(local_file_name, "rb") as data:
    blob_client.upload_blob(data, overwrite=True)
print("File uploaded successfully!")
The script uses the Azure CLI to get the account’s connection string, which it then uses to authenticate our Python script. It creates a BlobServiceClient, gets a client for our specific container and file, and then calls upload_blob to push the file to the cloud.
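The os.popen call works for a quick demo, but it ignores errors from the CLI: if az fails, the script just gets an empty string. A slightly more robust sketch (the helper name is my own, not from the post) uses the standard-library subprocess module instead, which raises an exception if the command fails:

```python
import subprocess


def get_connection_string(account_name: str) -> str:
    """Fetch a storage account's connection string via the Azure CLI.

    check=True makes subprocess.run raise CalledProcessError if the
    az command exits with an error, instead of silently returning
    an empty string the way os.popen would.
    """
    result = subprocess.run(
        ["az", "storage", "account", "show-connection-string",
         "--name", account_name,
         "--query", "connectionString",
         "--output", "tsv"],
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()
```

You could drop this function into upload.py and replace the os.popen lines with a single call to get_connection_string(storage_account_name).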
You can run this script in your terminal using:
python upload.py
You should see output indicating that the connection string was acquired and the file was uploaded.
Conclusion
And that’s really it! Using the Azure CLI and Python, we’ve created a resource group, a storage account, and a container, and written a Python script that talks to Azure and uploads a file from our local system. In this case, we’ve uploaded a single file, but by modifying the script, you could upload dozens or thousands of files. With some modifications to the script (or additional scripts), you could also do some of the following:
- List and download blobs using Python to list all the files in your new container, or download them back to your local machine
- Process data in the cloud, uploading a massive dataset from local storage and then using an Azure Function or Virtual Machine to run a data analysis script in the cloud
- Automate backups by creating a script that regularly backs up important local files to your cloud storage account for disaster recovery
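The first of those extensions only takes a few lines. Here’s a hedged sketch that reuses the connection-string trick from upload.py; the account and container names are the placeholders from this post, and the list_and_download helper is my own name for the routine, so swap in your real values before running it:

```python
import os

# Placeholders from this post -- replace with your own names.
STORAGE_ACCOUNT = "youruniquestorageaccountname"
CONTAINER = "digital-humanities-project"


def connection_string_command(account_name: str) -> str:
    """Build the az command that prints the raw connection string."""
    return (
        f"az storage account show-connection-string --name {account_name} "
        "--query connectionString --output tsv"
    )


def list_and_download(account_name: str, container_name: str,
                      dest_dir: str = ".") -> list:
    """List every blob in the container and download each one to dest_dir."""
    # Imported here so the command-building helper above is usable
    # even without the SDK installed.
    from azure.storage.blob import BlobServiceClient

    connection_string = os.popen(connection_string_command(account_name)).read().strip()
    client = BlobServiceClient.from_connection_string(connection_string)
    container = client.get_container_client(container_name)

    downloaded = []
    for blob in container.list_blobs():
        downloaded.append(blob.name)
        with open(os.path.join(dest_dir, blob.name), "wb") as f:
            f.write(container.get_blob_client(blob.name).download_blob().readall())
    return downloaded
```

After az login, calling list_and_download(STORAGE_ACCOUNT, CONTAINER) should pull down first-file.txt (and anything else in the container) and return the list of blob names.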