Azure Function Development on macOS with Docker

Creating a queue-triggered Azure Function and running it locally using Docker

As a developer, one question that always comes up when using any cloud provider as your platform of choice is, “How do I integrate cloud services with my development environment?”

You have two main options: provide a full cloud environment for each developer (which can be expensive and tricky to set up correctly), or run everything locally on your dev machine. There are, of course, hybrid solutions, but when starting a new project the quickest and easiest solution is often to run locally.

In this post I will show how to create a queue-triggered Azure Function and get it running in a local development environment using Docker on macOS. The source code for all of this is available on my GitHub.

A Problem to Solve

First, let's consider an example of where we might want to use a queue-triggered Azure Function: asynchronous messaging.

Asynchronous messaging is a pattern used in many architectures to decouple services. The idea is that rather than communicating directly, services communicate via messages that are typically placed on a queue. The services then no longer rely on each other, so if one becomes unavailable the other can carry on unaffected.

By combining the asynchronous messaging pattern with serverless, where the push of a message to a queue triggers the invocation of a function, you can stitch together these loosely coupled services with minimal configuration.

On a recent project we used asynchronous messaging to decouple a user-facing front-end web app from a data processing back-end.

On the front-end web app the user enters data into an online application form. This data is then pushed as a JSON message to an Azure Queue Storage queue, triggering an Azure Function. This function reads the message, performs some transformations on the data and ultimately uploads a file to an Azure Blob Storage container.

We’ll use this use case as the basis of our example in this post.

The Devs Toolset

Microsoft has always provided a great suite of developer tools in Visual Studio. It also provides utilities for developers who are more at home on the command line: dotnet for creating and running .NET projects, and func for creating and running Azure Functions.

I come from a varied dev background and I am never too far away from a terminal. I prefer command line utilities to GUIs, so these tools are a great fit for me.

As mentioned, I'll be using Docker to run everything in our local dev environment. From experience, Docker becomes more and more worthwhile as the number of components in your project grows, so you might as well use it from the beginning.

I’ll also be using Docker-compose for container orchestration. This will allow us to create and run all the docker containers we need with a single command.

Creating the Project

First up, we need to create the solution and project scaffolding. To do this, run these commands in your terminal of choice:

Setup the QueueTriggeredFunction solution and project
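
The embedded snippet isn't reproduced here, but a plausible set of commands, assuming the solution is named QueueTriggeredFunction and the project FunctionsProj (the project name used later in this post), would be:

```shell
# Create the solution folder and solution file
mkdir QueueTriggeredFunction && cd QueueTriggeredFunction
dotnet new sln --name QueueTriggeredFunction

# Scaffold the function project with the Azure Functions Core Tools
func init FunctionsProj --worker-runtime dotnet

# Add a queue-triggered function from the built-in template
cd FunctionsProj
func new --name QueueTriggeredFunction --template "Queue trigger"
cd ..

# Add the project to the solution
dotnet sln add FunctionsProj/FunctionsProj.csproj
```

The func utility comes from the Azure Functions Core Tools, installable on macOS via Homebrew.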

This creates the solution, the project, all associated folders and some template files. The most notable files created are the class QueueTriggeredFunction, which is the main entry point for the function and local.settings.json which contains the settings needed to run the function locally using the command line utilities (or Visual Studio).

Now that we have the project scaffolding, let's wire things together and write some code to do something with the queue message. The main entry point of the Azure Function looks like this:

QueueTriggeredFunction — the main class of the Azure Function
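
The embedded code isn't shown here; a sketch of what that entry point typically looks like follows. The binding names and the runner class come from the post; everything else (method names, logging) is an assumption:

```csharp
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public class QueueTriggeredFunction
{
    private readonly QueueTriggeredFunctionRunner _runner;

    // The runner is injected so the entry point stays thin
    public QueueTriggeredFunction(QueueTriggeredFunctionRunner runner)
    {
        _runner = runner;
    }

    [FunctionName("QueueTriggeredFunction")]
    public async Task Run(
        // %SourceQueueName% is resolved from settings; SourceQueueConnection
        // names the setting holding the storage connection string
        [QueueTrigger("%SourceQueueName%", Connection = "SourceQueueConnection")] string message,
        ILogger log)
    {
        log.LogInformation("Processing queue message");
        await _runner.RunAsync(message);
    }
}
```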

The parameters “SourceQueueName” and “SourceQueueConnection” in the “Run” method definition refer to parameters contained in the “Values” section of the local.settings.json file. Note that SourceQueueName is wrapped in % symbols, this is necessary to indicate it should be read from settings and not taken as a literal string. When deploying to Azure these values should be added as AppSettings in the function, and when running in a docker container they should be added as environment variables.

The settings used when running locally.
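
The settings file itself isn't embedded here; a minimal local.settings.json for this setup might look like the following. The queue name is illustrative, and UseDevelopmentStorage=true is the shortcut connection string that points the storage SDK at the local emulator:

```json
{
  "IsEncrypted": false,
  "Values": {
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "SourceQueueName": "source-queue",
    "SourceQueueConnection": "UseDevelopmentStorage=true"
  }
}
```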

I've created a new class, QueueTriggeredFunctionRunner, containing the main functionality of the Azure Function. It's best practice to treat the main entry point of the function as you would the main class in a console app: use it only for dependency creation and orchestration, handing the main logic off to another class.

QueueTriggeredFunctionRunner — class containing the logic of our Azure Function
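
The runner isn't reproduced here either; a sketch of the idea using the Azure.Storage.Blobs SDK follows. Apart from the class name, the method and member names are assumptions:

```csharp
using System;
using System.IO;
using System.Text;
using System.Threading.Tasks;
using Azure.Storage.Blobs;

public class QueueTriggeredFunctionRunner
{
    private readonly BlobContainerClient _container;

    public QueueTriggeredFunctionRunner(BlobContainerClient container)
    {
        _container = container;
    }

    // Takes the raw queue message and uploads it to blob storage as a txt file
    public async Task RunAsync(string message)
    {
        var blobName = $"{Guid.NewGuid()}.txt";
        using var stream = new MemoryStream(Encoding.UTF8.GetBytes(message));
        await _container.UploadBlobAsync(blobName, stream);
    }
}
```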

The logic of our function is kept simple: it takes the message passed in from the queue and uploads the contents to an Azure Blob Storage container as a txt file.

At this point we could run our function on our dev machine, but there is no queue to read from and no container to upload to, so nothing will happen.

We want our dev environment to be local, so that means we’ll need to emulate Azure Storage locally somehow. Enter Azurite.

Dockerise Things

Azurite is an Azure Storage emulator that can run natively as an app or run inside a docker container. We are going to run it in a docker container.

As mentioned above we’re also going to use docker-compose for container orchestration, it just makes things easier in the long run as you add more services. The docker-compose.yml file lives in the root folder of the solution and at this stage the contents look like this:

Docker compose for Azurite

It starts up Azurite as well as an Azure-Cli container that creates the queue and storage container required for the function to work.
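
The compose file isn't embedded here; a sketch along those lines, assuming a queue called source-queue and a blob container called output, might look like this. The account name and key are Azurite's well-known, publicly documented development defaults:

```yaml
version: "3.8"
services:
  azurite:
    image: mcr.microsoft.com/azure-storage/azurite
    ports:
      - "10000:10000"  # blob endpoint
      - "10001:10001"  # queue endpoint
      - "10002:10002"  # table endpoint

  # One-shot container that creates the queue and blob container on startup
  azurite-init:
    image: mcr.microsoft.com/azure-cli
    depends_on:
      - azurite
    environment:
      AZURE_STORAGE_CONNECTION_STRING: "DefaultEndpointsProtocol=http;AccountName=devstoreaccount1;AccountKey=Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw==;QueueEndpoint=http://azurite:10001/devstoreaccount1;BlobEndpoint=http://azurite:10000/devstoreaccount1;"
    command: >
      sh -c "az storage queue create --name source-queue &&
             az storage container create --name output"
```

Note the endpoints use the service name azurite rather than localhost, because the init container resolves it over the compose network.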

To run everything all we need to do is run the commands:

docker-compose up -d
cd FunctionsProj
func start

To interact with Azurite you can use the Azure Storage Explorer. Once started it will automatically pick up the Azurite container. Now you can add messages to the queue and see them magically transported to your blob storage container!

Dockerise All The Things

So far we've got Azurite running in a docker container and we are manually running the Azure Function. This is great during development of the function, as we are able to quickly make and test changes. But wouldn't it also be great if we could have the function running in the background while we develop other integrations — for example, an application that posts to the storage queue that triggers our function. Luckily, we can run the function in another docker container.

First, create a Dockerfile in the root folder of the solution. This contains instructions on how to build the function docker container.

Dockerfile for the Azure Function
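
The Dockerfile isn't embedded here; a minimal version, assuming the in-process v3 runtime image and the publish output path used later in this post, might look like:

```dockerfile
# Base image containing the Azure Functions runtime
FROM mcr.microsoft.com/azure-functions/dotnet:3.0

ENV AzureWebJobsScriptRoot=/home/site/wwwroot \
    AzureFunctionsJobHost__Logging__Console__IsEnabled=true

# Copy the published function artefacts into the runtime's script root
COPY ./published/FunctionsProj /home/site/wwwroot
```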

Then add this container to the docker-compose.yml, defining the settings as environment variables.

Docker-compose including the Azure Function
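
Again, the file isn't embedded; the added service might look like the fragment below, with the settings from local.settings.json supplied as environment variables. Since UseDevelopmentStorage=true points at localhost, inside the compose network we spell out the full connection string against the azurite hostname instead:

```yaml
  functions:
    build:
      context: .
      dockerfile: Dockerfile
    depends_on:
      - azurite
    environment:
      AzureWebJobsStorage: "DefaultEndpointsProtocol=http;AccountName=devstoreaccount1;AccountKey=Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw==;QueueEndpoint=http://azurite:10001/devstoreaccount1;BlobEndpoint=http://azurite:10000/devstoreaccount1;"
      SourceQueueName: "source-queue"
      SourceQueueConnection: "DefaultEndpointsProtocol=http;AccountName=devstoreaccount1;AccountKey=Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw==;QueueEndpoint=http://azurite:10001/devstoreaccount1;BlobEndpoint=http://azurite:10000/devstoreaccount1;"
```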

Before running docker-compose we need to build and publish the Azure Function artefacts to the folder the Dockerfile expects.

dotnet clean
dotnet build FunctionsProj
dotnet publish FunctionsProj/FunctionsProj.csproj -c Release -o ./published/FunctionsProj

Now just run docker-compose up -d --build. The Azure Function will start up inside a docker container and will be ready and waiting to process anything that is pushed to the queue.


So there you have it, I’ve shown how to create a queue triggered Azure Function and run it in a local development environment on MacOS. As you create more services you can add them into your docker-compose.yml, but there may come a point when your system is too big and sprawling to run everything locally. It’s also not always possible or sensible to run everything locally in docker so you’ve got to be pragmatic and be open to alternative approaches when setting up your dev environment.

So long and good luck!!

Senior Software Engineer with Kainos working in Belfast, NI. Former Physicist turned Software Engineer with an interest in solving problems with code