Design and Implement an Azure Storage Strategy
Objective 4.3: Implement Azure storage queues
The Azure Storage Queue service provides a mechanism for reliable inter-application messaging to support asynchronous distributed application workflows. This section covers a few fundamental features of the Queue service for adding messages to a queue, processing those messages individually or in a batch, and scaling the service.
Adding messages to a queue
You can access your storage queues and add messages to a queue using many storage browsing tools; however, it is more likely you will add messages programmatically as part of your application workflow.
The following code demonstrates how to add messages to a queue:
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Queue;

// Parse the storage account connection string and create a queue client.
string connection = "DefaultEndpointsProtocol=https;AccountName=<ACCOUNTNAME>;AccountKey=<ACCOUNTKEY>";
CloudStorageAccount account;
if (!CloudStorageAccount.TryParse(connection, out account))
{
    throw new Exception("Unable to parse storage account connection string.");
}
CloudQueueClient queueClient = account.CreateCloudQueueClient();
CloudQueue queue = queueClient.GetQueueReference("workerqueue");

// Add three messages to the queue.
queue.AddMessage(new CloudQueueMessage("Queued message 1"));
queue.AddMessage(new CloudQueueMessage("Queued message 2"));
queue.AddMessage(new CloudQueueMessage("Queued message 3"));
Processing messages
Messages are typically published by one application in the system, while a separate application listens to the queue and processes them. As shown in the previous section, you can create a CloudQueue reference and then call GetMessage() to de-queue the next available message as follows:
// De-queue the next message, making it invisible to other consumers
// for five minutes while it is being processed.
CloudQueueMessage message = queue.GetMessage(new TimeSpan(0, 5, 0));
if (message != null)
{
    string theMessage = message.AsString;
    // your processing code goes here
}
Retrieving a batch of messages
A queue listener can be implemented as single-threaded (processing one message at a time) or multi-threaded (processing messages in a batch on separate threads). You can retrieve up to 32 messages from a queue in a single call to GetMessages() and process them in parallel. As shown in the previous sections, create a CloudQueue reference, and then call GetMessages(), specifying the number of items to de-queue, up to 32 (this number can exceed the number of items currently in the queue), as follows:
// Retrieve up to ten messages, hiding them from other consumers for five minutes.
IEnumerable<CloudQueueMessage> batch = queue.GetMessages(10, new TimeSpan(0, 5, 0));
foreach (CloudQueueMessage batchMessage in batch)
{
    Console.WriteLine(batchMessage.AsString);
}
Scaling queues
When working with Azure Storage queues, you need to consider a few scalability issues, including the messaging throughput of the queue itself and the design topology for processing messages and scaling out as needed.
Each individual queue has a target of approximately 2,000 messages per second (assuming each message is 1 KB or smaller). You can partition your application to use multiple queues and increase aggregate throughput.
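The following is a minimal sketch of one way to partition messages across several queues, assuming the storage account reference from the earlier examples; the partition count, routing key, and queue names (workerqueue-0 through workerqueue-3) are illustrative assumptions, not a prescribed pattern.
// Route each message to one of several queues based on a key, so that the
// combined throughput target is roughly partitionCount x 2,000 messages per second.
// partitionCount, routingKey, and the queue names are illustrative assumptions.
int partitionCount = 4;
string routingKey = "customer-12345";
int partition = (routingKey.GetHashCode() & int.MaxValue) % partitionCount;

CloudQueueClient queueClient = account.CreateCloudQueueClient();
CloudQueue partitionedQueue = queueClient.GetQueueReference("workerqueue-" + partition);
partitionedQueue.CreateIfNotExists();
partitionedQueue.AddMessage(new CloudQueueMessage("Queued message for " + routingKey));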
As for processing messages, it is more cost-effective and efficient to pull multiple messages from the queue and process them in parallel on a single compute node; however, this depends on the type of processing and the resources required. Scaling out to additional compute nodes is usually also required to increase processing throughput.
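A minimal sketch of the single-node approach follows, assuming the queue reference from the earlier examples; ProcessMessage is a hypothetical placeholder for your own work, and Parallel.ForEach requires the System.Threading.Tasks namespace. Deleting each message after successful processing prevents it from becoming visible again when its visibility timeout expires.
// Pull a batch of up to 32 messages and process them in parallel on one node.
IEnumerable<CloudQueueMessage> messages = queue.GetMessages(32, TimeSpan.FromMinutes(5));
Parallel.ForEach(messages, message =>
{
    ProcessMessage(message.AsString); // hypothetical processing method
    queue.DeleteMessage(message);     // remove the message once it has been handled
});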
As discussed in Chapter 2, “Create and manage virtual machines,” and Chapter 3, “Design and implement cloud services,” you can configure VMs or cloud services to auto-scale by queue. You specify the target average number of messages to be processed per instance, and the auto-scale algorithm runs scale actions to increase or decrease the number of available instances accordingly.
Objective summary
- Applications can add messages to a queue programmatically using the .NET Storage Client Library or an equivalent library for other languages, or by calling the Storage API directly.
- Messages are stored in a storage queue for up to seven days, based on the expiry setting for the message. Message expiry can be modified while the message is in the queue (a brief sketch of the expiry setting follows this list).
- An application can retrieve messages from a queue in a batch to increase throughput and process the messages in parallel.
- Each queue has a target of approximately 2,000 messages per second. You can increase this throughput by partitioning messages across multiple queues.
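As an illustration of the expiry setting mentioned above, the following is a minimal sketch that adds a message with a shorter time-to-live, assuming the queue reference from the earlier examples and the AddMessage overload that accepts a timeToLive parameter; the two-day value is only an example.
// Add a message that expires after two days instead of the default seven.
// The two-day time-to-live is an illustrative assumption.
CloudQueueMessage expiringMessage = new CloudQueueMessage("Message with a custom expiry");
queue.AddMessage(expiringMessage, timeToLive: TimeSpan.FromDays(2));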
Objective review
Answer the following questions to test your knowledge of the information in this objective. You can find the answers to these questions and explanations of why each answer choice is correct or incorrect in the “Answers” section at the end of this chapter.
Which of the following statements are true about queuing messages? (Choose all that apply.)
- Storage queue messages have no size restrictions. The reason for using smaller message sizes is to increase throughput to the queue.
- Storage queue messages are limited to 64 KB.
- Storage queue messages are durable.
- The client application should save the message identifier returned after adding a message to a queue for later use.
Which of the following are valid options for processing queue messages? (Choose all that apply.)
- A single compute instance can process only one message at a time.
- A single compute instance can process up to 32 messages at a time.
- A single compute instance can retrieve up to 32 messages at a time.
- Messages can be read one at a time or in batches of up to 32 messages at a time.
- Messages are deleted as soon as they are read.
Which of the following are valid options for scaling queues? (Choose all that apply.)
- Distributing messages across multiple queues
- Automatically scaling websites based on queue metrics
- Automatically scaling VMs based on queue metrics
- Automatically scaling cloud services based on queue metrics