How, when, and why to use Microsoft Azure Functions

Benefit from Azure’s cloud-based, serverless computing

Serverless computing is a modern paradigm for deploying cloud-based applications. Previously, developers had to take care of too many things, such as updating the operating system, cleaning up the system disk, maintaining the whole environment, and even rebooting the system periodically. Only after installing a web server and the other software required for web application hosting could a developer finally host an application and show it to the whole world.

[Image: VM vs. cloud hosting diagram]

Today, developers using serverless computing enjoy cost savings, as they are only charged for the time the function app runs. Furthermore, they get scalability (e.g., Azure Functions scales automatically) without needing a high level of cloud computing expertise that is time-consuming to acquire.

This blog focuses on Azure cloud-based applications, specifically Azure Functions and Azure Durable Functions. In addition to providing context to help understand Azure Functions, I’ll go over how to extend Azure Functions with Durable Functions, when to use Azure Functions instead of a Web API, best practices for both Azure Functions and Durable Functions, tactics for avoiding pitfalls, and use cases.

The evolution of Azure Cloud Services

When I started learning the platform, I noticed that its mature ecosystem of services made my life much easier. The services supported many different programming languages, tools, and frameworks. Azure provided virtual machines (VMs), or infrastructure as a service (IaaS), which was very close to dedicated servers: in the IaaS world, the developer was responsible for managing almost everything.

Azure improved further by adding Cloud Services, a platform as a service (PaaS) supporting web applications. A Cloud Service was ready to deploy the application without any additional developer effort. Moreover, the platform had an OS and security patch update system that handled updates automatically. While a great improvement, this setup was not without risk. For example, the developer still had access to customize the underlying virtual machine, which could lead to accidental mistakes or a system breakdown.

In 2015, the new Azure App Service was introduced. The service took care of application deployment and management, while the developer only needed to concentrate on app development using a favorite programming language. Powerful features were added, such as autoscaling, load balancing, monitoring, health checks, and security. It looked great when you needed to develop and maintain a whole application with middleware, complex configuration, and controllers.

But what if you only need a short-lived process for a small action or a tiny integration? Well, in this case, Azure Functions becomes your best friend!

Understand Azure Functions

Azure Functions is a Microsoft Azure service that lets developers run small pieces of code and implement an event-driven (or message-based) system. When using Azure Functions, you can focus solely on writing code without worrying about infrastructure management such as web server configuration or operating system security updates. Moreover, you do not need to worry about scaling, because the service auto-scales based on workload. In other words, this serverless solution can automatically create additional instances under high load and ensure that your web application won't go down if there are more visitors than expected.

Another benefit of Azure Functions is that the billing model is based on resource consumption per second and per execution (if you use a Consumption plan), meaning that you only pay for the resources you utilize. For example, when some system event (or signal) is received, Azure allocates resources, wakes up an Azure Functions instance for a specific job, and then the function goes back to sleep. Sometimes you can pay even less, because the Consumption plan grants 1 million requests per month for free.

From the architect's perspective, an Azure Function is a single-purpose, reusable, stateless piece of code. Each function is designed to handle predefined events and can be cloned in the blink of an eye as many times as needed.

As for technical characteristics, Azure Functions supports most popular programming languages (C#, F#, JavaScript, TypeScript, Java, Python, and PowerShell) and can be triggered by specific events, such as:

  • An Azure Functions instance with an HTTP trigger can act as a Web API method, process a request, and return data as a response.

  • A timer trigger can run an Azure Functions instance to execute a job periodically.

  • A set of Azure Functions connected to specific queue triggers can be part of a message-based system.
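To make the first case concrete, here is a minimal sketch of an HTTP-triggered function, assuming the in-process C# programming model (Microsoft.Azure.WebJobs packages); the function and route names are illustrative only:

```csharp
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;

public static class HelloFunction
{
    // An HTTP-triggered function acting as a lightweight Web API method.
    [FunctionName("Hello")]
    public static IActionResult Run(
        [HttpTrigger(AuthorizationLevel.Function, "get")] HttpRequest req,
        ILogger log)
    {
        string name = req.Query["name"];
        log.LogInformation("Hello function processed a request.");

        // Respond with a greeting, falling back to a default when no name is given.
        return new OkObjectResult(
            $"Hello, {(string.IsNullOrEmpty(name) ? "world" : name)}!");
    }
}
```

Deployed to a function app, this endpoint is reachable at a URL like `https://<app-name>.azurewebsites.net/api/Hello?name=Dev`, and the runtime spins up as many instances of it as the incoming load requires.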

All these cases involve executing rapid bursts of stateless custom code at scale. Most executions have a short duration and hold no state, and these characteristics are essential for developing short-lived, scalable, low-cost applications.

What happens when you need to implement more complex logic, like running several functions in parallel or executing them one by one and aggregating the results into one object? What if a system needs a long-lived mechanism to handle human interaction or an external system event? Simple Azure Functions cannot cover these cases because of technical limitations, so Durable Functions should be used.

Azure Durable Functions: What they are and how to use them

Durable Functions is a framework that extends Azure Functions. The extension has three main parts:

  1. The starter function initiates an orchestration function and uses the same triggers that simple Azure Functions do.

  2. Activity functions are the basic units of work/tasks orchestrated during the process.

  3. The orchestration function coordinates activity function execution: it calls activity functions sequentially (or in parallel), saves their return values in local variables, aggregates them, and returns a result.
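The three parts fit together roughly as in the following sketch, assuming the in-process C# model with the Durable Task extension (Microsoft.Azure.WebJobs.Extensions.DurableTask); all function names here are hypothetical:

```csharp
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.DurableTask;
using Microsoft.Azure.WebJobs.Extensions.Http;

public static class Workflow
{
    // 1. Starter function: kicks off the orchestration from an HTTP trigger.
    [FunctionName("Workflow_Start")]
    public static async Task<HttpResponseMessage> Start(
        [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequestMessage req,
        [DurableClient] IDurableOrchestrationClient client)
    {
        string instanceId = await client.StartNewAsync("Workflow_Orchestrator", null);
        // Returns the built-in status URLs for the new orchestration instance.
        return client.CreateCheckStatusResponse(req, instanceId);
    }

    // 3. Orchestration function: calls the activities one by one and aggregates results.
    [FunctionName("Workflow_Orchestrator")]
    public static async Task<string> Orchestrate(
        [OrchestrationTrigger] IDurableOrchestrationContext context)
    {
        string first = await context.CallActivityAsync<string>("Workflow_Step", "one");
        string second = await context.CallActivityAsync<string>("Workflow_Step", "two");
        return $"{first}; {second}";
    }

    // 2. Activity function: a basic unit of work.
    [FunctionName("Workflow_Step")]
    public static string Step([ActivityTrigger] string input) => $"step {input} done";
}
```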

[Image: Azure Functions runtime orchestration diagram]

The picture above shows a starter function with an HTTP trigger that starts an orchestration function; the orchestration uses very simple logic, with three activity functions doing some work one by one. While debugging a very similar orchestration function, I noticed a very strange thing: code execution stopped on the first activity function call, and after the activity function completed, the orchestration function started again from the very beginning.

It turns out that an orchestration function does not wait for an activity function to finish; it completes its own execution every time it starts any activity function. The Azure Functions runtime then gets the activity function's results and puts them into the storage account linked to the durable function.

Under the hood, Event Sourcing is used to update the function state. It is a great way to save state: data is persisted by storing a sequence of state-changing events. Whenever an activity function (or another object) produces new information, a new event is appended to the function's event sequence. Once the state is saved, the Azure Functions runtime starts the orchestration function again and reconstructs its state by replaying those events.

On the second start, the orchestration function gets the first activity function's execution result and proceeds to the second activity function. The function performs the job and leaves logs (shown in the image above).

It's important to note the constraints, or code limitations, of orchestration functions. There are several restrictions on an orchestration function implementation, such as:

  • Orchestration code must be deterministic, meaning it should produce the same result each time it runs. Random number generation and the current date (the returned value is different on each replay) cannot be used. If you need the current date, use DurableOrchestrationContext.CurrentUtcDateTime instead.

  • An orchestration function must not directly read from a file or database and should not call async APIs. All these actions should be placed inside activity functions.

  • Do not use environment or static variables, because these values can change over time. Use constants instead, or move environment and static variable usage into activity functions.

  • Do not write an infinite loop: every iteration is appended to the execution history, which soon consumes all the memory. Use DurableOrchestrationContext.ContinueAsNew() for infinite loop scenarios instead.

  • The framework detects violations of the preceding rules and throws a NonDeterministicOrchestrationException when it finds one, but unfortunately this mechanism is not reliable. Sometimes it fails to detect violations, or it throws the exception for other reasons.
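A sketch of an orchestrator that respects these rules might look as follows, again assuming the in-process C# Durable Task extension; the "CheckStatus" activity name is hypothetical:

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.DurableTask;

public static class Monitor
{
    // An "eternal" orchestration: instead of an infinite loop, the function
    // runs one iteration and restarts itself with ContinueAsNew, which
    // truncates the execution history instead of growing it forever.
    [FunctionName("Monitor_Orchestrator")]
    public static async Task Run(
        [OrchestrationTrigger] IDurableOrchestrationContext context)
    {
        // Deterministic: CurrentUtcDateTime returns the same value on every
        // replay, unlike DateTime.UtcNow.
        DateTime nextCheck = context.CurrentUtcDateTime.AddMinutes(5);

        // Non-deterministic work (I/O, HTTP calls) belongs in an activity.
        await context.CallActivityAsync("CheckStatus", null);

        // A durable timer instead of Task.Delay.
        await context.CreateTimer(nextCheck, CancellationToken.None);

        // Restart with fresh history rather than looping.
        context.ContinueAsNew(null);
    }
}
```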

Why use Azure Functions instead of Web API?

A Web API has different load on different controllers and methods; often only a few of them need to scale out. Even so, the whole Azure App Service has to be scaled up (e.g., increasing RAM size or adding a CPU core) or scaled out (e.g., adding more service instances). Beware that these actions are not the most cost-efficient.

Under high load, each Azure Function scales out separately. The Azure Functions runtime creates more function instances just to carry out the current job and releases the resources afterward. In terms of cost savings, it is the best solution.

Azure Functions can also be more convenient for rare or periodic actions that run only a few hours a day. When there is no load, the bill tends to be very small.

From the functionality perspective, Azure Functions requests have the same headers, authentication methods, and body as Web API requests. Due to its stateless nature, there are no cookies or sessions, but nothing prevents you from creating a RESTful API.

Azure Functions supports the dependency injection design pattern. To register services, you add a Startup file (very similar to a Web API project) and configure all components in one place.
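A minimal Startup sketch, assuming the in-process model with the Microsoft.Azure.Functions.Extensions package; IOrderService and OrderService are hypothetical application services added only for illustration:

```csharp
using Microsoft.Azure.Functions.Extensions.DependencyInjection;
using Microsoft.Extensions.DependencyInjection;

[assembly: FunctionsStartup(typeof(MyApp.Startup))]

namespace MyApp
{
    // Hypothetical application service used to illustrate registration.
    public interface IOrderService { }
    public class OrderService : IOrderService { }

    // Runs once at host startup, much like a Web API Startup class.
    public class Startup : FunctionsStartup
    {
        public override void Configure(IFunctionsHostBuilder builder)
        {
            builder.Services.AddHttpClient();
            builder.Services.AddSingleton<IOrderService, OrderService>();
        }
    }
}
```

Registered services can then be constructor-injected into function classes, provided the functions are instance methods rather than static ones.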

Despite the benefits, I would not recommend using Azure Functions as a Web UI backend when it runs on the Consumption plan. A function cold start can considerably increase response time and make your application slower (this topic is described in the issues section below). Keep in mind that Azure Functions is not a silver bullet. It is not a one-to-one replacement for all Web APIs but can be a better solution in specific instances.

Address common Azure Functions issues

Nothing's perfect, and Azure Functions is no exception; some problems persist. The section below covers three common issues along with solutions to address them.

The issue: Cold starts
When an Azure Functions instance is not invoked for a considerable time (about 20 minutes), the next call takes much longer to respond. The duration of such a cold-start delay depends on several infrastructure parameters, such as the programming language, package size, and operating system. It can be an unpleasant surprise when you develop a UI backend and end users have a bad experience because of occasionally poor performance.

You can avoid cold starts by:

  1. Using a timer trigger to keep the Azure Functions instance constantly warm: periodic invocation will not let the Azure runtime dispose of the instance and release its resources. It is a little hacky and probably costs more because of the additional (frequent but short) executions. A smarter way is to warm up your function just before you expect the next action (for example, when a user visits a specific webpage and the function is likely to be triggered soon).

  2. Changing the Consumption plan to an App Service plan: you get a dedicated computing resource. Note that in this case you lose the flexible pay-per-use model and get a fixed monthly fee per server instance. Moreover, workload-based auto-scale does not work with the App Service plan.

  3. Using the dedicated pre-warmed instances of the Azure Functions Premium plan: this option keeps the auto-scale feature with no cold starts.
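The timer-trigger approach from the first option can be sketched as a tiny keep-warm function (in-process C# model; the 15-minute schedule is an arbitrary choice, not a recommendation):

```csharp
using System;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class KeepWarm
{
    // Fires every 15 minutes so the runtime never disposes of the idle instance.
    // NCRONTAB format: {second} {minute} {hour} {day} {month} {day-of-week}.
    [FunctionName("KeepWarm")]
    public static void Run(
        [TimerTrigger("0 */15 * * * *")] TimerInfo timer,
        ILogger log)
    {
        log.LogInformation($"Warm-up ping at {DateTime.UtcNow:O}");
    }
}
```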

The issue: "Non-Deterministic workflow detected" error messages
As discussed above, an orchestration function should call only deterministic APIs. However, even if your function does not perform any restricted actions, you can still get a misleading "Non-Deterministic workflow detected" message. This error occurred in one of my projects, and it took a lot of time to figure out what was going wrong: a long-running orchestration function (2-3 hours) occasionally failed with the "Non-Deterministic workflow detected" error.

After a long investigation and communication with the Azure Functions developers, we found that orchestration functions in different environments had the same TaskHub name. As a result, the Durable Functions internal service data got mixed up, and executions failed. Pay close attention to Azure Functions configuration parameters and do not allow the same names in different environments. Otherwise, you will face misleading error messages and unnecessary problems that take a lot of effort and money to resolve.

The issue: Failures when an orchestration or activity function changes
By their nature, Durable Functions have a lot of moving parts, and some changes can temporarily break in-flight instances. Any activity function signature change, or an additional activity function call added to an orchestration function, can break execution.

The mismatch between the stored execution history and the new function implementation does not let execution continue. The issue fades away on the next execution, but such failures can be an unpleasant surprise in a mission-critical application. To handle a breaking change, either wait until all in-flight orchestration instances end or use side-by-side deployment (update the task hub name on deployment).

6 best practices for Azure Functions and Durable Functions

This section provides guidance on how to improve the debugging experience and reliability of Azure Functions and Durable Functions applications. The following are best practices and general rules that developers can use to design better serverless solutions with Azure Functions.

  1. Every time an orchestration function replays, it re-executes its code, making it difficult to collect data, and debugging can be cumbersome because of the many replays. Try using a conditional breakpoint where context.IsReplaying is false to stop only on non-replay executions. Use the same condition for logging to exclude redundant log messages. Moreover, to keep the code short and clean, create a ReplaySafeLogger that automatically filters out logs repeated because of replays.

  2. Split large functions into smaller function sets. The Azure runtime can execute smaller functions in parallel more efficiently thanks to workload-based auto-scale. At the same time, unit tests become simpler to implement.

  3. Use an "async all the way" approach, meaning that all methods in an execution chain should be async. In C# async programming, blocking async execution with Task.Result or Task.Wait is usually a bad idea and can lead to deadlocks in some cases. The Azure Functions framework has a different problem: these methods temporarily block code execution and threads, and with many concurrent executions this can cause thread exhaustion.

  4. Each Durable Functions application generates a lot of internal read/write operations: storing execution history, managing triggers, using storage account queues for communication between functions, and saving technical logs. It is reasonable to have a separate storage account for each serverless application. For best performance, create the storage account in the same region as the function app.

  5. Collect logs, performance, and error data by using the built-in Azure Application Insights integration. Even though Azure is cloud-based, you can use the tool in your local development environment to diagnose issues more easily and better understand how your functions work. You can also configure the log level, add custom telemetry data, and review the data as tables or charts.

  6. Use the Azure Functions and Azure Key Vault integration to keep the secrets (connection strings, keys, or passwords) used to access external services. Reference an Azure Key Vault value from the Azure Functions app settings (or retrieve those values programmatically from code) to use these secrets in a secure way.
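The replay-aware logging from the first practice can be sketched as follows, assuming the in-process C# Durable Task extension (which provides CreateReplaySafeLogger); the orchestrator and "DoWork" activity names are hypothetical:

```csharp
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.DurableTask;
using Microsoft.Extensions.Logging;

public static class LoggingOrchestration
{
    [FunctionName("Logging_Orchestrator")]
    public static async Task Run(
        [OrchestrationTrigger] IDurableOrchestrationContext context,
        ILogger log)
    {
        // Manual guard: log only on the first (non-replay) execution.
        if (!context.IsReplaying)
        {
            log.LogInformation("Starting orchestration {Id}", context.InstanceId);
        }

        // Built-in alternative: a wrapper that filters out replayed log entries.
        ILogger safeLog = context.CreateReplaySafeLogger(log);
        safeLog.LogInformation("This line is logged once, not once per replay.");

        await context.CallActivityAsync("DoWork", null);
    }
}
```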

Create flexible, innovative applications

The Azure Functions framework is a mature Microsoft product that allows you to create more flexible applications so you can innovate and respond to change faster. Here are several practical use cases for serverless solutions in real life.

Azure Functions with an HTTP trigger can act as a Web API application. When each endpoint has a different load, the Azure Functions runtime automatically scales each function: endpoints with high load get more function instances while, at the same time, endpoints with low load keep only one instance.

Microservices that are rarely used can be replaced with an Azure Functions instance. In this case, the fixed price of a dedicated instance is replaced by a billing model based on per-second resource consumption. Moreover, it can even be free, considering that Azure Functions grants 1 million requests per month at no charge.

Azure Functions with a timer trigger can execute a periodic action. For example, the system should check something in a database periodically and perform an action depending on the outcome. It could be a part of a reporting system that prepares reports, generates PDF files, and saves them in a specific directory.

A function in Azure that is triggered by Blob storage can be useful for file processing. For example, an application lets users upload an image file that should then be cropped and compressed. An Azure Functions instance can be triggered automatically on the file upload and process the file.
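A minimal sketch of such a blob-triggered function (in-process C# model; the "uploads" container name and the processing step are hypothetical):

```csharp
using System.IO;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class ImageProcessor
{
    // Runs automatically whenever a new blob lands in the "uploads" container.
    [FunctionName("ProcessUploadedImage")]
    public static void Run(
        [BlobTrigger("uploads/{name}")] Stream image,
        string name,
        ILogger log)
    {
        log.LogInformation($"Processing uploaded file {name} ({image.Length} bytes)");
        // Crop/compress the image here, then save the result to another container.
    }
}
```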

Durable Functions can be used to handle long-running Web API calls. Since an orchestration function does not have a timeout limit, it can control a very long process. Moreover, during process execution, a built-in status mechanism (the status URL of the orchestration instance) provides execution details: status, input parameters, created and last-updated times, history events, and a custom status that can be set programmatically in the function body.

Use Durable Functions to execute multiple actions concurrently. You can then perform some aggregation on the results in a fan-out/fan-in scenario. For example, an application loads files in parallel and then counts the total size of files that are uploaded.
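The fan-out/fan-in pattern for that example can be sketched as follows (in-process C# Durable Task extension; the "ListFiles" and "GetFileSize" activities are hypothetical):

```csharp
using System.Linq;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.DurableTask;

public static class FanOutFanIn
{
    // Fan-out: start one activity per file in parallel.
    // Fan-in: await all of them and aggregate the sizes into one total.
    [FunctionName("TotalSize_Orchestrator")]
    public static async Task<long> Run(
        [OrchestrationTrigger] IDurableOrchestrationContext context)
    {
        string[] files = await context.CallActivityAsync<string[]>("ListFiles", null);

        Task<long>[] sizeTasks = files
            .Select(f => context.CallActivityAsync<long>("GetFileSize", f))
            .ToArray();

        long[] sizes = await Task.WhenAll(sizeTasks);
        return sizes.Sum();
    }
}
```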

Azure Functions can be a "building block" for Azure Logic Apps: an Azure Functions instance performs a specific job within the logic app. Conversely, a logic app can be called from Azure Functions.

There are several reasons beyond the use cases above to turn to Azure. From the functionality perspective, Azure Functions and Durable Functions come with automatic scaling based on your workload volume, built-in high availability, and a variety of programming languages and hosting options. When it comes to billing, Azure Functions has a useful Consumption plan that is based on per-second resource consumption and executions (and grants 1 million requests per month for free). In terms of monitoring and analysis, Azure has a superior Application Insights service that can be connected to Azure Functions and allows you to monitor applications, detect performance anomalies, and use powerful analytics tools. Given all the detail above, I hope you have a stronger understanding of Azure Functions and will move forward with confidence when using this serverless solution.
