Exploring Azure Functions: Bindings

Recap

In my previous article, we saw how EF Core can be used with an HTTP trigger. If you haven’t read my previous articles, I highly encourage you to do so.

Introduction

Before diving into bindings, we first need to understand triggers, as bindings cannot exist without an Azure Function trigger. We have already discussed the different types of triggers in the Azure Function introduction article.

A trigger is what causes a function to run. The trigger defines how the function is invoked; every function has exactly one trigger and zero or more optional bindings.

A binding is a declarative connection to data within your Azure Function. There are two types of bindings:

  • Input Binding: An input binding is the data that your function receives.
  • Output Binding: An output binding is the data that your function sends.

To better understand input and output bindings, let’s take an example.

Suppose an Azure Function is set to trigger every 5 minutes. The function reads an image from a blob container and sends an email to the intended recipients.

  • The timer is the trigger
  • Reading the image from the blob container is an input binding
  • Sending an email to a recipient is an output binding
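In code, this scenario maps to attributes on the function's parameters. The sketch below is purely illustrative (it is not part of this article's sample project) and assumes the SendGrid binding extension is installed; the blob path and schedule are placeholder values:

```csharp
// Illustrative sketch only: a timer trigger with a blob input binding
// and a SendGrid output binding. "images/photo.png" and the CRON
// expression are placeholder assumptions, not values from this article.
[FunctionName("EmailImageEveryFiveMinutes")]
public static void Run(
    [TimerTrigger("0 */5 * * * *")] TimerInfo timer,          // Trigger: runs every 5 minutes
    [Blob("images/photo.png", FileAccess.Read)] Stream image, // Input binding: reads the image from blob storage
    [SendGrid] out SendGridMessage message,                   // Output binding: sends the email
    ILogger log)
{
    message = new SendGridMessage();
    // ... attach the image stream and set the recipients here ...
}
```

Note that the function body never touches a storage SDK directly; the attributes declare where the data comes from and goes to, and the runtime handles the plumbing.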

Coding

In this article, we’ll concentrate only on bindings rather than deploying an application to Azure. In upcoming articles, we’ll look into multiple ways of deploying our application to Azure.

Scenario 1

Firstly, I’ll be extending the code that we implemented in the previous article, i.e. Azure Function HTTP Trigger using EF Core. Upon saving the data to the database, a Queue output binding will be used to write the data to local storage.

Secondly, another function will run upon insertion of data into the queue. Here the queue acts as an input binding, and the queue data will be stored in a Blob container via an output binding.

Prerequisites

  • Add the NuGet package Microsoft.Azure.WebJobs.Extensions.Storage
  • Azure Storage Emulator, for testing in a local environment.

You can download the emulator from here, or run it in a Docker container using the steps below (note that Azurite is Microsoft’s recommended storage emulator going forward):

docker pull microsoft/azure-storage-emulator

docker run -p 10000:10000 -p 10001:10001 -p 10002:10002 microsoft/azure-storage-emulator

In the previous article, we saw how an employee record is inserted into the database using an HTTP trigger:

[FunctionName("SaveEmployee")]
public async Task<ActionResult> SaveEmployeeAsync(
    [HttpTrigger(AuthorizationLevel.Anonymous, "post")] HttpRequest req,
    ILogger log)
{
    string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
    var data = JsonConvert.DeserializeObject<Employee>(requestBody);
    await employeeContext.Employees.AddAsync(data);
    await employeeContext.SaveChangesAsync();
    return new OkResult();
}
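To try this locally, you can POST a JSON body matching the Employee entity. Judging from the properties written out later in this article (Id, Name, Age, City, State), a sample request body would look like this (all values are made up):

```json
{
  "Id": 1,
  "Name": "John Doe",
  "Age": 30,
  "City": "Seattle",
  "State": "WA"
}
```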

In order to use a queue as the output binding, we need to specify the Queue attribute either on the function’s return value (in case you are returning the response as a queue message) or on a method parameter of the function.

In the above example, we cannot use the return value for the queue because we are already sending an HTTP response (200 OK) to the client. In such scenarios, we have to apply the Queue attribute to a function method parameter of type IAsyncCollector&lt;T&gt; or ICollector&lt;T&gt;. As you might have guessed, IAsyncCollector&lt;T&gt; is for asynchronous operations, whereas ICollector&lt;T&gt; is for synchronous ones. In this example, we’ll be using IAsyncCollector&lt;T&gt;.

IAsyncCollector&lt;T&gt; is a type exposed by the Azure WebJobs SDK that collects items and writes them to the bound output asynchronously.

[FunctionName("SaveEmployee")]
public async Task<ActionResult> SaveEmployeeAsync(
    [HttpTrigger(AuthorizationLevel.Anonymous, "post")] HttpRequest req,
    [Queue("employee")] IAsyncCollector<Employee> empCollector,
    ILogger log)
{
    string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
    var data = JsonConvert.DeserializeObject<Employee>(requestBody);
    await employeeContext.Employees.AddAsync(data);
    await employeeContext.SaveChangesAsync();
    await empCollector.AddAsync(data);
    return new OkResult();
}

Here we add the data to the queue using the AddAsync method.

Please make sure that AzureWebJobsStorage is set to use development storage in the local.settings.json file:

"AzureWebJobsStorage": "UseDevelopmentStorage=true"
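For reference, a minimal local.settings.json for local development typically looks like the sketch below (the FUNCTIONS_WORKER_RUNTIME value assumes the in-process .NET model used by this article's samples):

```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet"
  }
}
```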

With these changes, we’ll be able to see the employee information in the local queue storage using the emulator.

Now, we need to use the queue as an input binding and store the queue data in a Blob container as an output. For this, we create a new queue-triggered function class:

public class BlobOutputFromQueue
{
    [FunctionName("QueueTrigger")]
    public void QueueTriggerAndBlobOutput(
        [QueueTrigger("employee", Connection = "AzureWebJobsStorage")] Employee employee,
        [Blob("employee/{rand-guid}.json")] TextWriter textWriter)
    {
        textWriter.WriteLine($"id:{employee.Id}");
        textWriter.WriteLine($"Name:{employee.Name}");
        textWriter.WriteLine($"Age:{employee.Age}");
        textWriter.WriteLine($"City:{employee.City}");
        textWriter.WriteLine($"State:{employee.State}");
    }
}

In the above sample, QueueTrigger is used as the input binding. We specify the queue name (employee) and the connection string setting in the QueueTrigger attribute.

The Blob container is used as an output binding by specifying the Blob attribute. Here, “employee” is the name of the blob container, and for every queue message a new blob with a random GUID name is created in the container.
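For example, when a queue message for an employee is processed, the function creates a blob such as employee/&lt;some-guid&gt;.json whose content (given the WriteLine calls above) would look like this, with illustrative values:

```
id:1
Name:John Doe
Age:30
City:Seattle
State:WA
```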

See how easy it is to wire up these storage services in Azure Functions; bindings eliminate a lot of boilerplate code.

Scenario 2

Rather than building on the previous EF Core example, I’m using a simpler example here.

Firstly, upon an HTTP request, the request body will be sent to a Service Bus queue using an output binding.

Secondly, after the message is sent to the Service Bus queue, another function will be triggered and the data will be logged by the application.

Prerequisites

  • Add the NuGet package Microsoft.Azure.WebJobs.Extensions.ServiceBus
  • Create a new Azure Service Bus queue by referring to my article here
[FunctionName("HttpToServiceBusQueue")]
public static async Task<IActionResult> Run(
    [HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = null)] HttpRequest req,
    [ServiceBus("testqueue", Connection = "connectionString")] IAsyncCollector<string> outputEvents,
    ILogger log)
{
    log.LogInformation("C# HTTP trigger function processed a request.");

    string name = req.Query["name"];

    string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
    dynamic data = JsonConvert.DeserializeObject(requestBody);
    name = name ?? data?.name;
    await outputEvents.AddAsync(requestBody);

    string responseMessage = string.IsNullOrEmpty(name)
        ? "This HTTP triggered function executed successfully. Pass a name in the query string or in the request body for a personalized response."
        : $"Hello, {name}. This HTTP triggered function executed successfully.";
    return new OkObjectResult(responseMessage);
}

We used the ServiceBus attribute as an output binding; you need to specify the queue name (created in Azure) and the connection string from “Shared Access Policies” in the Azure portal.
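In local.settings.json, the setting name must match the Connection property of the ServiceBus attribute ("connectionString" in this example). A sketch with a placeholder namespace and key, following the standard Service Bus connection string format:

```json
{
  "Values": {
    "connectionString": "Endpoint=sb://<your-namespace>.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=<your-key>"
  }
}
```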

[FunctionName("servicebusQueue")]
public static void Run(
    [ServiceBusTrigger("testqueue", Connection = "connectionString")] string myQueueItem,
    ILogger log)
{
    log.LogInformation($"C# ServiceBus queue trigger function processed message: {myQueueItem}");
}

Here I’m just logging the data that we received from the Service Bus queue. Note that the trigger’s Connection must point to the Service Bus connection string setting, not the storage account setting.

Finally, we have managed to put all the code changes in place.

I hope you liked the article. If you found it interesting, kindly like and share it.
