In the digital world, data comes in different shapes: text, media such as images and video, or raw binary sets of zeros and ones. Each of these data types is best kept in a data store designed and built for it, and the Microsoft Azure Storage Account service is one of the data store solutions you can use.
Storage account service types
Microsoft Azure provides a storage service that supports different types of data for different scenarios, scales easily under heavy demand, and integrates with many technologies: the Storage Account service. It comes with four kinds of data storage: Blobs, Files, Queues, and Tables.
Imagine that you have a team of 50 developers and you need to share with them some tools to install or files to use: the File Share service from the Azure Storage Account can achieve this. Think of it as a network drive that can be connected to multiple devices at once. Or say you have 10 servers that share the same configuration files you want to edit in one place: those files can live in a single file share mounted on all of them.
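For instance, on Linux an Azure file share can be mounted over SMB. Here is a minimal sketch, assuming a storage account named mystorageaccount and a share named teamtools (both hypothetical names); the account key serves as the password:

sudo mkdir -p /mnt/teamtools
sudo mount -t cifs //mystorageaccount.file.core.windows.net/teamtools /mnt/teamtools \
  -o vers=3.0,username=mystorageaccount,password=<account-key>,dir_mode=0777,file_mode=0777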
Blob storage may seem like a file share service, but it is used for other purposes. Have you ever wondered where Instagram photos or YouTube videos are stored? Blob storage was built to store files for applications: files we can access using web browsers, or, more technically speaking, files we can access through the HTTP protocol. It offers many interesting features that will excite you as a software developer when you build apps that rely on blob storage.
Now that you know the big picture of both blob and file storage, let’s shift our focus to the Queue service. Imagine a magical world where you have a box to leave letters for your friends, and they can pick them up whenever they want. Queue storage works similarly in cloud computing, by holding messages instead of letters. Authorized applications read these messages one at a time, forming a virtual waiting line between senders and receivers. This service facilitates asynchronous communication between applications.
For example, let’s consider a photo-sharing app that compresses uploaded photos in the background to keep the app responsive. Whenever a new photo is added to blob storage, a message with relevant information about the uploaded image file is added to the queue; the app later picks up and processes this message. This demonstrates how Queue storage streamlines and optimizes tasks in cloud-based applications.
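As a rough sketch of how the producer side might look with the @azure/storage-queue package (the queue name, environment variable, and message shape are invented for the example):

import { QueueServiceClient } from '@azure/storage-queue';

// The connection string comes from the storage account's Access keys blade.
const queueService = QueueServiceClient.fromConnectionString(
  process.env.AZURE_STORAGE_CONNECTION_STRING ?? '',
);
const queue = queueService.getQueueClient('photos-to-compress');

async function enqueuePhotoJob(blobName: string) {
  await queue.createIfNotExists();
  // Message bodies are plain text; JSON is a common convention.
  await queue.sendMessage(JSON.stringify({ blobName, uploadedAt: Date.now() }));
}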
Now, let’s dive into table storage. Say you have a huge amount of structured data, like a list of customers and their respective information such as addresses and contact details. You can use table storage as a NoSQL database to persist the data, look through it, and read or edit it at any time later.
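A tiny sketch with the @azure/data-tables package (the table name and entity fields are invented for the example):

import { TableClient } from '@azure/data-tables';

const customers = TableClient.fromConnectionString(
  process.env.AZURE_STORAGE_CONNECTION_STRING ?? '',
  'customers',
);

async function addCustomer() {
  // Create the table on first run; recent SDK versions ignore "already exists".
  await customers.createTable();
  await customers.createEntity({
    partitionKey: 'europe',
    rowKey: 'customer-0001',
    name: 'Jane Doe',
    email: 'jane@example.com',
  });
}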
Creating and configuring a storage account
In this example, you will learn how to create a storage account in Microsoft Azure and how to configure it. Go to the Azure Portal (portal.azure.com), use the search box in the navigation bar to find the Storage Accounts list, and then click the Create button.
Within the creation wizard, you will first go through the Basics settings, where you have to specify in which Subscription and which Resource Group your storage account should be created.
You will also have to specify a unique name for it, between 3 and 24 characters in length, that may contain numbers and lowercase letters only.
More settings are required at this level, such as the main region where the storage account should be located, a performance tier, and a redundancy strategy.
The other settings found in the Advanced, Networking, Data Protection, and Encryption tabs will be discussed in the next sections of this page. You can click Review and then Create; the deployment will take a few seconds to complete.
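If you prefer the command line, an equivalent account can be created with the Azure CLI; a minimal sketch, with placeholder names for the resource group and account:

az group create --name demo-rg --location westeurope
az storage account create \
  --name azurehacksdemo123 \
  --resource-group demo-rg \
  --location westeurope \
  --sku Standard_LRS \
  --kind StorageV2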
Understanding the Storage Account features and configurations is important for a better implementation and design of your applications, as it will impact several aspects of your projects such as security, costs, and performance. Just keep reading to learn more.
Performance, scaling, and cost models
Azure offers two different pricing tiers that each meet unique requirements.
The Standard tier is the parent of three sub-tiers: Hot, Cool, and Archive. The Hot tier is optimized for frequently accessed data and has low latency, unlike the Cool tier, which is designed for infrequently accessed data and offers lower storage costs. The Archive tier, on the other hand, suits rarely accessed data, hence it has the lowest storage costs.
The Premium tier is designed for high-performance workloads with heavy-duty needs; it has the lowest latency and is used for business-critical applications such as databases.
In addition to the pricing tier, another element enters the cost formula: the data redundancy strategy. Microsoft Azure offers six choices:
- (LRS) Locally redundant storage: stores three copies of your data within the same data center, ensuring protection against disk and physical server failures.
- (ZRS) Zone redundant storage: stores three copies of your data within the same region but in three different data centers (availability zones), ensuring data resiliency across those zones.
- (GRS) Geo-redundant storage: stores six copies of your data across a primary and a secondary region, giving protection against regional disasters.
- (GZRS) Geo-zone-redundant storage: stores six copies of your data within a primary and secondary region following the ZRS and LRS models respectively.
- Read-access GRS: besides the GRS features, you can read data from the secondary region at any time, even before a failover.
- Read-access GZRS: same as Read-access GRS but with the GZRS features.
Storage account costs aren’t based only on the region where your data lies, the redundancy strategy, and the pricing tier; other variables such as the data volume (how many gigabytes) and the incoming and outgoing data traffic also count.
If you’re planning to use Blob storage, you will find the lifecycle management feature very interesting: it allows you to automatically move blob items (files) from one tier to another.
For example, let’s say you’ve uploaded PDF files (customer invoices), processed them, and no longer need to access them frequently, but you still want to keep them available: you can move them from the Hot to the Cool tier.
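Such a rule can be expressed as a lifecycle management policy. A minimal sketch using the Azure CLI, reusing the placeholder names from earlier and assuming an invoices/ prefix and a 30-day threshold:

az storage account management-policy create \
  --account-name azurehacksdemo123 \
  --resource-group demo-rg \
  --policy '{
    "rules": [{
      "name": "cool-old-invoices",
      "enabled": true,
      "type": "Lifecycle",
      "definition": {
        "filters": { "blobTypes": ["blockBlob"], "prefixMatch": ["invoices/"] },
        "actions": {
          "baseBlob": { "tierToCool": { "daysAfterModificationGreaterThan": 30 } }
        }
      }
    }]
  }'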
You can find more details about costs on the storage account pricing page.
You can estimate your storage account usage costs using the Azure Pricing Calculator.
Securing access to storage account data
Data security is a critical concern, and the Storage Account service offers different robust security measures such as data encryption, network protection, shared access signatures, access keys, and Entra ID (formerly Azure AD).
In addition to the enforced HTTPS access to storage account services, you can have your data encrypted either with encryption keys managed by Microsoft or with keys provided by your team. It is also possible to specify which networks or which IP addresses are allowed to access the account.
Shared Access Signatures (SAS) are a mechanism that offers time-bounded and fine-tuned delegated access to your storage resources. When generating a SAS token, you define the specific permissions granted, the resources accessible, and the duration of access. This approach minimizes the exposure of your storage account credentials and enhances security when sharing access with third parties or applications.
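For illustration, here is a sketch of generating a one-hour, read-only SAS URL for a single blob with @azure/storage-blob (the account, container, and blob names are placeholders):

import {
  BlobSASPermissions,
  generateBlobSASQueryParameters,
  StorageSharedKeyCredential,
} from '@azure/storage-blob';

const account = 'azurehacksprojects'; // placeholder account name
const credential = new StorageSharedKeyCredential(
  account,
  process.env.AZURE_STORAGE_ACCOUNT_KEY ?? '',
);

// One-hour, read-only access to a single blob.
const sas = generateBlobSASQueryParameters(
  {
    containerName: 'invoices',
    blobName: 'purchases.pdf',
    permissions: BlobSASPermissions.parse('r'),
    expiresOn: new Date(Date.now() + 60 * 60 * 1000),
  },
  credential,
).toString();

const sasUrl = `https://${account}.blob.core.windows.net/invoices/purchases.pdf?${sas}`;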
When you create a storage account, it comes with two access keys. They provide full access to your data, so it’s recommended to use them with caution and to rotate them periodically.
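Rotating a key can be done from the portal or, for example, with the Azure CLI (resource names are placeholders):

az storage account keys renew \
  --account-name azurehacksdemo123 \
  --resource-group demo-rg \
  --key primary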
Although access keys can be used to authorize your applications to access and manage your storage account, Microsoft recommends Entra ID (formerly Azure AD) as a more secure alternative: access keys give absolute control over the storage account, whereas with Entra ID you can specify and limit the permissions your app needs using predefined or custom RBAC (Role-Based Access Control) roles.
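Under this model, the shared-key credential used later in this demo could be swapped for a token credential from the @azure/identity package; a minimal sketch, assuming the app’s identity was granted a role such as Storage Blob Data Contributor:

import { BlobServiceClient } from '@azure/storage-blob';
import { DefaultAzureCredential } from '@azure/identity';

// Picks up a managed identity, environment credentials, or your local az CLI login.
const blobServiceClient = new BlobServiceClient(
  'https://azurehacksprojects.blob.core.windows.net', // placeholder account URL
  new DefaultAzureCredential(),
);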
Technical Demonstration
In this part of our journey with the Storage Account, you will learn how to build a Node.js application that uploads PDF files to the blob service. This demo app uses the Nest.js framework.
Step 1 – Project Setup
Assuming that Node.js and the Nest CLI are both already installed, let’s bootstrap a new project and install the necessary Azure modules using the following commands:
nest new storage-account-demo
cd storage-account-demo
npm install @azure/storage-blob
npm install -D @types/multer
Don’t forget to open the storage-account-demo directory in your favorite code editor and then start the project using the following command:
npm run start:dev
The application should be accessible at http://localhost:3000; when you open this link, you should see a page with the “Hello World” text.
Step 2 – Storage account module setup
The Nest.js framework uses a modular architecture, which promotes maintainability, reusability, and separation of concerns in applications. Let’s build a module for the Azure storage account:
cd src
nest generate module storage-account
nest generate service storage-account
The second command generates a service, another building block of the Nest.js framework, which is responsible for encapsulating business logic. Services are often used to interact with databases or call external APIs.
Using the text editor, go to src/storage-account/storage-account.service.ts and add the following code:
import { Injectable } from '@nestjs/common';
import {
  BlobServiceClient,
  StorageSharedKeyCredential,
} from '@azure/storage-blob';

@Injectable()
export class StorageAccountService {
  private readonly account = '';
  private readonly accountKey = '';
  private readonly containerName = '';
  private readonly blobServiceClient: BlobServiceClient;

  constructor() {
    // Authenticate with the storage account name and one of its access keys.
    const sharedKeyCredential = new StorageSharedKeyCredential(
      this.account,
      this.accountKey,
    );
    this.blobServiceClient = new BlobServiceClient(
      `https://${this.account}.blob.core.windows.net`,
      sharedKeyCredential,
    );
    this.bootstrap();
  }

  // Create the container on startup if it doesn't exist yet.
  async bootstrap() {
    const container = this.blobServiceClient.getContainerClient(
      this.containerName,
    );
    const exists = await container.exists();
    if (!exists) {
      await container.create();
    }
  }

  // Upload the received file as a block blob and return its URL.
  async uploadBlob(blob: Express.Multer.File) {
    const containerClient = this.blobServiceClient.getContainerClient(
      this.containerName,
    );
    const blobClient = containerClient.getBlockBlobClient(blob.originalname);
    await blobClient.uploadData(blob.buffer, {
      blobHTTPHeaders: {
        blobContentType: blob.mimetype,
      },
    });
    return blobClient.url;
  }
}
This class contains three fields that should hold the storage account name, one of its access keys, and the desired container name where we’ll store our blobs; make sure to update their values with the right ones:
private readonly account = '';
private readonly accountKey = '';
private readonly containerName = '';
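Hard-coding these values is acceptable for a quick demo, but in a real project you would more likely read them from environment variables; a sketch with invented variable names:

private readonly account = process.env.AZURE_STORAGE_ACCOUNT ?? '';
private readonly accountKey = process.env.AZURE_STORAGE_ACCOUNT_KEY ?? '';
private readonly containerName = process.env.AZURE_STORAGE_CONTAINER ?? 'invoices';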
The constructor of this class uses the StorageSharedKeyCredential object as an authentication method to authorize this app to use the storage account through the blob client. There is also a call to the bootstrap function, which creates the container if it doesn’t exist:
async bootstrap() {
  const container = this.blobServiceClient.getContainerClient(
    this.containerName,
  );
  const exists = await container.exists();
  if (!exists) {
    await container.create();
  }
}
Implementing the function that uploads the file as a blob to Azure is a piece of cake: it first retrieves the container where the blob will be stored, then generates a client for that blob, and finally triggers the upload before returning its URL:
async uploadBlob(blob: Express.Multer.File) {
  const containerClient = this.blobServiceClient.getContainerClient(
    this.containerName,
  );
  const blobClient = containerClient.getBlockBlobClient(blob.originalname);
  await blobClient.uploadData(blob.buffer, {
    blobHTTPHeaders: {
      blobContentType: blob.mimetype,
    },
  });
  return blobClient.url;
}
To make this service reusable, we should export it at the module level: edit the storage-account.module.ts file and add StorageAccountService to the exports list of the @Module decorator, so it looks like this:
import { Module } from '@nestjs/common';
import { StorageAccountService } from './storage-account.service';

@Module({
  providers: [StorageAccountService],
  exports: [StorageAccountService],
})
export class StorageAccountModule {}
The next step is to use the StorageAccountModule in other parts of our Nest.js application.
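The Nest CLI usually registers the generated module in AppModule automatically; if yours doesn’t, app.module.ts should look roughly like this:

import { Module } from '@nestjs/common';
import { AppController } from './app.controller';
import { AppService } from './app.service';
import { StorageAccountModule } from './storage-account/storage-account.module';

@Module({
  imports: [StorageAccountModule], // makes StorageAccountService injectable here
  controllers: [AppController],
  providers: [AppService],
})
export class AppModule {}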
Step 3 – File upload controller setup
The goal here is to create an HTTP endpoint /upload that uploads the file it receives as a blob using the storage account service. The part responsible for handling web requests in Nest.js is the controller: find app.controller.ts and change its content to the following:
import {
  Controller,
  Post,
  UploadedFile,
  UseInterceptors,
} from '@nestjs/common';
import { AppService } from './app.service';
import { FileInterceptor } from '@nestjs/platform-express';

@Controller()
export class AppController {
  constructor(private readonly appService: AppService) {}

  @Post('upload')
  @UseInterceptors(FileInterceptor('file'))
  async uploadFile(@UploadedFile() file: Express.Multer.File) {
    return this.appService.uploadFile(file);
  }
}
This controller uses the Post and UseInterceptors decorators to receive HTTP requests with an embedded file, which is handled by the uploadFile function; that function delegates the action to the AppService class.
Edit the content of the app.service.ts file to the following:
import { Injectable } from '@nestjs/common';
import { StorageAccountService } from './storage-account/storage-account.service';

@Injectable()
export class AppService {
  constructor(private readonly storageAccountService: StorageAccountService) {}

  async uploadFile(file: Express.Multer.File) {
    return this.storageAccountService.uploadBlob(file);
  }
}
This class simply asks the storage account service we built earlier to upload the file as a blob; a reference to that service is injected through the constructor.
Step 4 – Experimenting
Now that the application is ready to do its job, pick any file of your choice from your computer and run the following command, making sure to replace purchases.pdf with your actual file name:
curl -X POST -F "file=@purchases.pdf" http://localhost:3000/upload
Running this command will upload your file to the Nest.js application, which will in turn upload it to blob storage and return the file’s path within the storage account; it should look like this:
https://azurehacksprojects.blob.core.windows.net/invoices/purchases.pdf
If you wish to download a full copy of this project, it is available in this repository.
Conclusion
In this exploration of Microsoft Azure Storage, we’ve uncovered a world of possibilities for developers. Armed with practical know-how, may your applications soar in the cloud.
Microsoft Azure Storage isn’t just a repository of data; it’s a dynamic ecosystem empowering developers to craft efficient and secure solutions.
Continue exploring Azure’s potential, and may your cloud ventures be both transformative and rewarding. Happy coding!
If you have questions about the storage account service, you can use the contact page.