Durable queue in Azure Service Fabric using WebJobs, Part 1: Introduction to Asynchronous Operations

This is the first post in a series about Durable queue in Azure Service Fabric using WebJobs:

  • Part 1: Introduction to Asynchronous Operations
  • Part 2: Web API

The actor model built into Azure Service Fabric (ASF, or just SF) and its queuing/dispatching mechanism has one serious disadvantage, among possible others: if an actor is created to process each incoming HTTP request (for a given id), then, since actors are single-threaded by design, all subsequent requests won’t be processed until the actor is done with the first one. After that, only one more request is picked up for processing while all the others wait in a simple in-memory queue, and so on. This is unacceptable under heavy load, because all incoming HTTP requests are blocked and waiting, and so are the users in browsers behind them.

Before switching to ASF, I was successfully using Azure Web Apps with WebJobs in the back to off-load computationally heavy HTTP requests which would otherwise be dropped by timeout. The workflow is as follows:

  • The Web API accepts an HTTP request, converts it into a queue message payload, enqueues that message into an Azure Queue, and returns status code 202 Accepted, which means the request has not been processed yet but has been accepted for future processing. According to RFC 7231 – HTTP/1.1 Semantics and Content, the RESTful Cookbook, and the Microsoft docs for HDInsight, the response should also include a Location header which the client can follow to check the status of the operation.
  • The WebJobs host polls the queue at a configurable interval and invokes a so-called queue trigger which, from the SDK consumer’s perspective, looks like an asynchronous method invocation. One of the advantages is that the message payload can be read as either a string or a custom strongly-typed model class.
  • If the trigger succeeds (doesn’t throw an exception), the message is considered processed and disappears from the queue.
  • Otherwise the trigger is invoked again. If the unsuccessful-invocation threshold is reached, the message becomes poison and is moved into a separate, so-called poison message queue, which can be monitored just like any other regular queue.
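The trigger side of the steps above can be sketched as a WebJobs function. Note that the queue name, payload type, and method names here are illustrative assumptions, not part of the actual sample:

```csharp
using Microsoft.Azure.WebJobs;

// Hypothetical payload type; in practice this mirrors whatever the Web API
// serialized into the queue message.
public class WorkItem
{
    public string Id { get; set; }
    public string Data { get; set; }
}

public class Functions
{
    // The WebJobs host invokes this method whenever a message appears in the
    // (hypothetical) "work-items" queue. The payload is bound to a
    // strongly-typed model rather than a raw string.
    public static void ProcessWorkItem([QueueTrigger("work-items")] WorkItem item)
    {
        // Throwing here makes the host re-deliver the message; after
        // MaxDequeueCount failed attempts it is moved to "work-items-poison".
        DoHeavyWork(item);
    }

    private static void DoHeavyWork(WorkItem item)
    {
        // ... expensive computation ...
    }
}
```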

To enable the same workflow on the cluster, start the WebJobs host within a stateless service:

using System;
using System.Fabric;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.ServiceFabric.Services.Runtime;

namespace DurableQueueSample.StatelessWebJobsService
{
  internal sealed class StatelessWebJobsService : StatelessService
  {
    public StatelessWebJobsService(StatelessServiceContext context)
      : base(context)
    {
    }
 
    protected override async Task RunAsync(CancellationToken cancellationToken)
    {
      try
      {
        // connectionString is assumed to come from the service's
        // configuration; its declaration is omitted here for brevity.
        var config = new JobHostConfiguration
        {
          DashboardConnectionString = connectionString,
          StorageConnectionString = connectionString,
          Queues =
          {
            BatchSize = 5,
            MaxDequeueCount = 3,
            MaxPollingInterval = TimeSpan.FromSeconds(30)
          }
        };
        var host = new JobHost(config);
        await host.StartAsync(cancellationToken);
      }
      catch (Exception ex)
      {
        ServiceEventSource.Current.ServiceStartupFailedEvent(ex.ToString());
        throw;
      }
    }
  }
}

I omitted most of the host configuration for brevity; you can find more in its source code, which is luckily fully open source. I usually set the following properties, instantiated by a Dependency Injection (DI) container such as Simple Injector:

  • DashboardConnectionString
  • StorageConnectionString
  • JobActivator
  • NameResolver
  • TypeLocator
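A minimal sketch of wiring the first of these, the JobActivator, to a DI container, assuming Simple Injector; the ContainerJobActivator class is an illustration, not something the SDK provides:

```csharp
using Microsoft.Azure.WebJobs.Host;
using SimpleInjector;

// Resolves job classes through the container so that their
// constructor dependencies are injected.
internal sealed class ContainerJobActivator : IJobActivator
{
    private readonly Container _container;

    public ContainerJobActivator(Container container)
    {
        _container = container;
    }

    public T CreateInstance<T>()
    {
        return (T)_container.GetInstance(typeof(T));
    }
}

// Usage: config.JobActivator = new ContainerJobActivator(container);
```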

Next time: we’ll create another stateless service that will host the Web API.

This entry was posted in Programming.
