Durable Functions: Fan Out Fan In Patterns

This post is a collaboration between myself and my awesome coworker, Maxime Rouiller.

Durable Functions? Wat. If you’re new to Durable, I suggest you start here with this post that covers all the essentials so that you can properly dive in. In this post, we’re going to dive into one particular use case so that you can see a Durable Function pattern at work!

Today, let’s talk about the Fan Out, Fan In pattern. We’ll do so by retrieving an open issue count from GitHub and then storing what we get. Here’s the repo where all the code lives that we’ll walk through in this post.

View Repo

About the Fan Out/Fan In Pattern

We briefly mentioned this pattern in the previous article, so let’s review. You’d likely reach for this pattern when you need to execute multiple functions in parallel and then perform some other task with those results. You can imagine that this pattern is useful for quite a lot of projects, because it’s pretty often that we have to do one thing based on data from a few other sources.

For example, let’s say you are a takeout restaurant with a ton of orders coming through. You might use this pattern to first get the order, then use that order to figure out prices for all the items, the availability of those items, and see if any of them have any sales or deals. Perhaps the sales/deals are not hosted in the same place as your prices because they are controlled by an outside sales firm. You might also need to find out what your delivery queue is like and who on your staff should get it based on their location.

That’s a lot of coordination! But you’d need to then aggregate all of that information to complete the order and process it. This is a simplified, contrived example of course, but you can see how useful it is to work on a few things concurrently so that they can then be used by one final function.
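Outside of Durable Functions, the same shape can be sketched with plain promises — the order fans out into parallel lookups, then one final step aggregates. (The helper functions here are hypothetical stand-ins for the restaurant services, just to show the shape.)

```javascript
// Hypothetical helpers standing in for the pricing, inventory, and deals services
const getPrice = item => Promise.resolve({ item, price: 5 });
const getAvailability = item => Promise.resolve({ item, inStock: true });
const getDeals = item => Promise.resolve({ item, deal: null });

async function processOrder(order) {
  // Fan out: run every lookup for every item concurrently
  const results = await Promise.all(
    order.items.map(item =>
      Promise.all([getPrice(item), getAvailability(item), getDeals(item)])
    )
  );
  // Fan in: one final step aggregates everything into the processed order
  const total = results.reduce((sum, [priced]) => sum + priced.price, 0);
  return { order, total };
}
```

Calling `processOrder({ items: ['taco', 'burrito'] })` resolves with a total of 10 in this sketch — Durable Functions gives us the same coordination, but with the orchestration state managed for us.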

Here’s what that looks like, in abstract code and a visualization:

See the Pen Durable Functions: Pattern #2, Fan Out, Fan In by Sarah Drasner (@sdras) on CodePen.

const df = require('durable-functions')

module.exports = df(function* (ctx) {
  // items to process concurrently, added to an array
  const tasks = []
  const taskItems = yield ctx.df.callActivityAsync('fn1')
  taskItems.forEach(item => tasks.push(ctx.df.callActivityAsync('fn2', item)))
  yield ctx.df.Task.all(tasks)
  // send results to last function for processing
  yield ctx.df.callActivityAsync('fn3', tasks)
})

Now that we see why we would want to use this pattern, let’s dive into a simplified example that explains how.

Setting up your environment to work with Durable Functions

First things first. We’ve got to get our development environment ready to work with Durable Functions. Let’s break that down.

GitHub Personal Access Token

To run this sample, you’ll need to create a personal access token in GitHub. Go under your account photo, open the dropdown, and select Settings, then Developer settings in the left sidebar. On the next screen, click the Personal access tokens option in that same sidebar.

Then a prompt will come up and you can click the Generate new token button. You should give your token a name that makes sense for this project. Like “Durable functions are better than burritos.” You know, something standard like that.

For the scopes/permission option, I suggest selecting “repo,” which then allows you to click the Generate token button and copy the token to your clipboard. Please keep in mind that you should never commit your token. (It will be revoked if you do. Ask me why I know that.) If you need more info on creating tokens, there are further instructions here.

Functions CLI

First, we’ll install the latest version of the Azure Functions CLI. We can do so by running this in our terminal:

npm i -g azure-functions-core-tools@core --unsafe-perm true

Does the --unsafe-perm flag freak you out? It did for me as well. Really what it’s doing is preventing UID/GID switching when package scripts run, which is necessary because the package itself is a JavaScript wrapper around .NET. Installing via Homebrew avoids the flag entirely, and more information about that is here.

Optional: Setting up the project in VS Code

Totally not necessary, but I like working in VS Code with Azure Functions because it has great local debugging, which is typically a pain with Serverless functions. If you haven’t already installed it, you can do so here:

Set up a Free Trial for Azure and Create a Storage Account

To run this sample, you’ll need to test drive a free trial for Azure. Go into the portal and use the plus sign in the top lefthand corner to make a new Blob Storage account, then retrieve its keys. Once we have that all squared away, we’re ready to rock!

Setting up Our Durable Function

Let’s take a look at the repo we have set up. We’ll clone or fork it:

git clone https://github.com/Azure-Samples/durablefunctions-apiscraping-nodejs.git 

Here’s what that initial file structure is like.

file structure for the durable function repo

(This visualization was made from my CLI tool.)

In local.settings.json, change GitHubToken to the value you grabbed from GitHub earlier, and do the same for the two storage keys — paste in the keys from the storage account you set up earlier.

Then run:

func extensions install
npm i
func host start

And now we’re running locally!

Understanding the Orchestrator

As you can see, we have a number of folders within the FanOutFanInCrawler directory. The functions in the GetAllRepositoriesForOrganization, GetAllOpenedIssues, and SaveRepositories directories are the ones that we will be coordinating.

Here’s what we’ll be doing:

  • The Orchestrator will kick off the GetAllRepositoriesForOrganization function, passing in the organization name retrieved via getInput() from the Orchestrator_HttpStart function
  • Since this is likely to be more than one repo, we’ll first create an empty array, then loop through all of the repos, running GetOpenedIssues for each one and pushing those tasks onto the array. Everything here fires concurrently because none of these calls is individually yielded inside the iterator
  • Then we’ll wait for all of the tasks to finish executing and finally call SaveRepositories, which will store all of the results in Blob Storage

Since the other functions are fairly standard, let’s dig into that Orchestrator for a minute. If we look inside the Orchestrator directory, we can see it has a fairly traditional setup for a function with index.js and function.json files.

Generators

Before we dive into the Orchestrator, let’s take a very brief side tour into generators, because you won’t be able to understand the rest of the code without them.

A generator is not the only way to write this code! It could be accomplished with other asynchronous JavaScript patterns as well. It just so happens that this is a pretty clean and legible way to write it, so let’s look at it really fast.

function* generator(i) {
  yield i++;
  yield i++;
  yield i++;
}

var gen = generator(1);

console.log(gen.next().value); // 1
console.log(gen.next().value); // 2
console.log(gen.next().value); // 3
console.log(gen.next()); // {value: undefined, done: true}

After the initial little asterisk following function*, you can begin to use the yield keyword. Calling a generator function does not execute the whole function in its entirety; an iterator object is returned instead. The next() method will walk over them one by one, and we’ll be given an object that tells us both the value and done — which will be a boolean of whether we’re done walking through all of the yield statements. You can see in the example above that for the last .next() call, an object is returned where done is true, letting us know we’ve iterated through all values.
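One more generator detail matters for what follows: next() can also pass a value back in, and that value becomes the result of the paused yield expression. This two-way communication is how an orchestrator receives an activity’s result when it yields callActivityAsync. A simplified sketch (the string and array here are just stand-ins, not real Durable calls):

```javascript
function* orchestration() {
  // Whatever is passed to the next next() call becomes the result of this yield
  const repos = yield 'GetAllRepositoriesForOrganization';
  return repos.length;
}

const it = orchestration();
it.next();                                    // runs up to the first yield
const result = it.next(['repo-a', 'repo-b']); // resume, handing back a "result"
console.log(result.value); // 2
console.log(result.done);  // true
```

The Durable Functions runtime plays the role of the caller here: it waits for the activity to complete, then resumes the generator with the activity’s output.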

Orchestrator code

We’ll start with the require statement we’ll need for this to work:

const df = require('durable-functions')

module.exports = df(function* (context) {
  // our orchestrator code will go here
})

It’s worth noting that the asterisk there marks this as a generator function, which returns an iterator when called.

First, we’ll get the organization name from the Orchestrator_HttpStart function and get all the repos for that organization with GetAllRepositoriesForOrganization. Note that we use yield in the repositories assignment so the orchestrator waits for that activity to finish before moving on.

const df = require('durable-functions')

module.exports = df(function* (context) {
  var organizationName = context.df.getInput()
  var repositories = yield context.df.callActivityAsync(
    'GetAllRepositoriesForOrganization',
    organizationName
  )
})

Then we’re going to create an empty array named output, create a for loop from the array we got containing all of the organization’s repos, and use that to push the issues into the array. Note that we don’t use yield here so that they’re all running concurrently instead of waiting one after another.

const df = require('durable-functions')

module.exports = df(function* (context) {
  var organizationName = context.df.getInput()
  var repositories = yield context.df.callActivityAsync(
    'GetAllRepositoriesForOrganization',
    organizationName
  )

  var output = []
  for (var i = 0; i < repositories.length; i++) {
    output.push(
      context.df.callActivityAsync('GetOpenedIssues', repositories[i])
    )
  }
})

Finally, when all of these executions are done, we’ll gather the results with Task.all and pass them to the SaveRepositories function, which will save them to Blob Storage. Then we’ll return the unique ID of the instance (context.instanceId).

const df = require('durable-functions')

module.exports = df(function* (context) {
  var organizationName = context.df.getInput()
  var repositories = yield context.df.callActivityAsync(
    'GetAllRepositoriesForOrganization',
    organizationName
  )

  var output = []
  for (var i = 0; i < repositories.length; i++) {
    output.push(
      context.df.callActivityAsync('GetOpenedIssues', repositories[i])
    )
  }

  const results = yield context.df.Task.all(output)
  yield context.df.callActivityAsync('SaveRepositories', results)

  return context.instanceId
})

Now we’ve got all the steps we need to manage all of our functions with this single orchestrator!

Deploy

Now the fun part. Let’s deploy! 🚀

To deploy the components, Azure requires you to install the Azure CLI and log in with it.

First, you will need to provision the service. Look into the provision.ps1 file that’s provided to familiarize yourself with the resources we are going to create. Then, you can execute the file with the previously generated GitHub token like this:

.\provision.ps1 -githubToken <TOKEN> -resourceGroup <ResourceGroupName> -storageName <StorageAccountName> -functionName <FunctionName>

If you don’t want to install PowerShell, you can also take the commands within provision.ps1 and run them manually.

And there we have it! Our Durable Function is up and running.

The post Durable Functions: Fan Out Fan In Patterns appeared first on CSS-Tricks.

Create your own Serverless API

If you don’t already know of it, Todd Motto has this great list of public APIs. It’s awesome if you’re trying out a new framework or new layout pattern and want to hit the ground running without fussing with the content.

But what if you want or need to make your own API? Serverless can help create a nice one for data you’d like to expose for use.

Serverless really shines for this use case, and hopefully this post makes it clear why. In a non-serverless paradigm, we’d have to pick something like Express, set up endpoints, give our web server secured access to our database server, deploy it, and so on. In contrast, here we’ll be able to create an API in a few button clicks, with minor modifications.

Here’s the inspiration for this tutorial: I’ve been building a finder to search for new cocktails and grab a random one. I originally started using a public API but quickly found its contents limiting and wanted to shape my own.

I’m going to use Azure for these examples but it is possible to accomplish what we’re doing here with other cloud providers as well.

Make the Function

To get started, if you haven’t already, create a free Azure trial account. Then go to the portal: preview.portal.azure.com.

Next, we’ll hit the plus sign at the top left and select Serverless Function App from the list. We can then fill in the new name of our function and some options. It will pick up our resource group and subscription, and create a storage account from defaults. It will also use the location data from the resource group. So, happily, it’s pretty easy to populate the data.

Next, we’ll create a new function using HTTP Trigger, and go to the Integrate tab to perform some actions:

What we did was:

  • Give the route template a name
  • Change the authorization level to “Anonymous”
  • Change the allowed HTTP methods to “Selected methods”
  • Select “GET” for the selected method and uncheck everything else

Now, if we get the function URL from the top right of our function, we’ll get this:

https://sdras-api-demo.azurewebsites.net/api/hello

The initial boilerplate testing code that we’re given is:

module.exports = function (context, req) {
  context.log('JavaScript HTTP trigger function processed a request.');

  if (req.query.name || (req.body && req.body.name)) {
    context.res = {
      // status: 200, /* Defaults to 200 */
      body: "Hello " + (req.query.name || req.body.name)
    };
  } else {
    context.res = {
      status: 400,
      body: "Please pass a name on the query string or in the request body"
    };
  }
  context.done();
};

Now if we go to the URL below, we’ll see this:

https://sdras-api-demo.azurewebsites.net/api/hello/?name=TacoFace
Says hello TacoFace

There’s more information in this blog post, including API unification with Function Proxies. You can also use custom domains, not covered here.

OK, now that that initial part is all set up, let’s find a place to host our data.

Storing the data with CosmosDB

There are a number of ways to store the data for our function. I wanted to use Cosmos because it has one-click integration, making for a pretty low-friction choice. Get a free account here. Once you’re in the portal, we’ll go to the plus sign in the top left to create a new service and this time select “CosmosDB.” In this case, we chose the SQL version.

We have a few options for how to create our documents in the database. We’re merely making a small example for demo purposes, so we can manually create them by going to Data Explorer in the sidebar. In there, I made a database called CocktailFinder, a collection called Cocktails, and added each document. For our Partition Key, we’ll use /id.

creating a collection

In real practice, you’d probably either want to upload a JSON file by clicking the “Upload JSON” button or follow this article for how to create the files with the CLI.

We can add something in JSON format like this:

{
  "id": "1",
  "drink": "gin_and_tonic",
  "ingredients": [
    "2 ounces gin",
    "2 lime wedges",
    "3–4 ounces tonic water"
  ],
  "directions": "Add gin to a highball glass filled with ice. Squeeze in lime wedges to taste, then add them to glass. Add tonic water; stir to combine.",
  "glass": [
    "highball",
    "hurricane"
  ],
  "image": "https://images.unsplash.com/photo-1523905491727-d82018a34d75?ixlib=rb-0.3.5&ixid=eyJhcHBfaWQiOjEyMDd9&s=52731e6d008be93fda7f5af1145eac12&auto=format&fit=crop&w=750&q=80"
}

And here is the example with all three that we’ve created, and what Cosmos adds in:

Have the function surface the data

OK, now we’ve got some dummy data to work with, so let’s connect it to our serverless function so we can finish up our API!

If we go back to our function in the portal and click Integrate in the sidebar again, there’s a middle column called Inputs. Here, we can click on the plus that says “+ New Inputs” and a few options come up. We’ll click the CosmosDB option and “Create.”

Showing in the portal the button to create a connection with CosmosDB

A prompt will come up that asks us to provide information about our Database and Collection. If you recall, the databaseName was CocktailFinder and the collectionName was Cocktails. We also want to use the same partitionKey which was /id. We’ll use all the other defaults.

Now if we go to our function.json, we can see that it’s been updated with:

{
  "type": "documentDB",
  "name": "inputDocument",
  "databaseName": "CocktailFinder",
  "collectionName": "Cocktails",
  "partitionKey": "/id",
  "connection": "sdrassample_DOCUMENTDB",
  "direction": "in"
}

We’re going to use that inputDocument binding in our testing function, updating it to reflect what we’re actually trying to access. I also add in the status and log frequently, but that’s optional.

module.exports = function (context, req) {
  context.log('JavaScript HTTP trigger function processed a request.');

  if (context.bindings) {
    context.log('Get ready');
    context.res = { status: 200, body: context.bindings.inputDocument };
    context.log(context.res);
  } else {
    context.res = {
      status: 400,
      body: "Something went wrong"
    };
  }
  context.done();
};

Now you can see our function at work!

CORS!

Can’t forget the axios call. I have a really barebones CodePen implementation so that you can see how that might work. In Vue, we’re hooking into the created lifecycle method and making the GET request, in order to output the JSON data to the page.

See the Pen Show API Output, beginning of Vue.js integration by Sarah Drasner (@sdras) on CodePen.

From here, you have all that data to play with, and you can store whatever you like in whatever form you fancy. You can use something like Vue, React, or Angular to create interfaces with the content we stored in our database, and our serverless function creates the API for us. The sky is the limit to what we can create! 🎉

The post Create your own Serverless API appeared first on CSS-Tricks.

Displaying the Weather With Serverless and Colors

I like to jog. Sometimes it’s cold out. Sometimes it’s cold out, but it looks like it isn’t. The sun is shining, the birds are chirping. Then you step outside in shorts and a t-shirt and realize you have roughly 2 minutes before exposure sets in.

I decided to solve this first world problem using a lightbulb to display a certain color based on what the temperature outside is. It works better than I expected, and that’s saying something because usually nothing works out like I want it to.

This was a fun project to build, and since it is essentially a hosted service running on a timer, it’s a perfect use case for Serverless.

Now you might be thinking, “um, wouldn’t it be easier to just check the weather?” Well it would, but then I wouldn’t have an excuse to buy an expensive lightbulb or write an article with the word “Serverless.”

So let’s look at how you can build your own Weather Bulb. The final code is not complicated, but it does have some interesting pieces that are worth noting. By the way, did I mention that it’s Serverless?

Building the Weather Bulb

The first thing you are going to need is the bulb. You can’t have a Weather Bulb sans bulb. Say the word “bulb” out loud about 10 times and you’ll notice what a bizarre word it is. Bulb, bulb, bulb, bulb — see? Weird.

I am using the LIFX Mini Color. It’s not *too* expensive, but more importantly, it’s got an API that is wide open.

The API has two methods of authentication. The first contains the word “OAuth” and I’m already sorry that you had to read that. Don’t worry, there is an easier way that doesn’t involve OAu…. that which won’t be named.

The second way is to register an application with LIFX. You get back a key and all you have to do is pass that key with any HTTP request. That’s what I’m using for this demo.

For instance, if we wanted to change the bulb color to blue, we can just pass color: blue to the /state endpoint.

The API supports a few different color formats, including named colors (like red, blue), hex values, RGB, Kelvin, and hue, saturation, and brightness. This is important because it factors into what proved to be the hardest part of this project: turning temperature into color.

Representing Temperature With Color

If you’ve ever watched a weather report, you’ll be familiar with the way that meteorology represents weather conditions with color on a map.

Usually, this is done to visualize precipitation. You have probably seen that ominous green strip of storms bearing down on you on a weather map while you try to figure out if you should get in the bathtub because you’re in the path of a tornado. Or maybe that’s just all of us unlucky souls here in America’s Tornado Alley.

Color is also used to represent temperature. This is precisely what I wanted to do with the bulb. The tough thing is that there doesn’t seem to be a standardized way to do this. Some maps show it as solid colors in bands. In this case blue might represent the band from 0℉ – 32℉.

Others have it as a gradient scale which is more precise. This is what I was after for the Weather Bulb.

My first stab at solving this was just to Google “temperature color scale” and other various iterations of that search term. I got back a lot of information about Kelvin.

Kelvin is a representation of the temperature of a color. Literally. For any light source (light bulb, the sun, etc.) the actual temperature of that source will affect the color of the light it emits. A fire burns a yellowish red color. The hotter that fire gets, the more it moves towards white. Hence the saying, “white hot”. So if someone ever says “red hot,” you can correct them in front of everyone because who doesn’t love a pedantic jerk?

The LIFX bulb supports Kelvin, so you might think that this would work. After all, this is the Kelvin scale….

The problem is that there is simply not enough color variation because these are not actual colors, but rather the tinge of color that a light is emitting based on its “temperature.” Here is the Kelvin color wheel that comes with the LIFX app.

These colors are barely distinguishable from one another on the bulb. Not exactly what I was after.

That leaves me with trying to convert the color to either hex, RGB, or some other format. This is tough because where do you begin? I spent an embarrassing amount of time adjusting RGB scale values between blue for cold (0, 0, 255) and red for hot (255, 0, 0). It was about this time that it dawned on me that maybe HSL would be a better way to go here. Why? Because hue is a whole lot easier to understand.

Hue

Hue is a representation of color on a scale between 0 and 360. This is why we often see color represented on a wheel (360°). That’s a vast oversimplification, but unless you want me to start talking about wavelengths, let’s go with that definition.

The hue color wheel looks like this….

If we flatten it out, it’s easier to reason about.

We’re ready to convert temperature to color. The first thing we need to do is figure out a set temperature range. I went with 0℉ to 100℉. We can’t work with infinite temperature color combinations. Numbers go on forever, colors do not. It can only get so hot before our bulb is just bright red all the time, and that’s 100℉. The same is true for cold.

If light blue represents 0℉, I can start at about the 200 mark on the hue scale. Red will represent 100℉. You can see that red is at both extremes, so I can move either left OR right, depending on what colors I want to use to represent the temperature. It’s not the same as the colors they use in actual weather programs, but who cares? Obviously not me.

I chose to go right because there is no pink on the left and pink is my favorite color. I also felt like pink represents warm a bit better than green. Green is rain and tornadoes.

Now we can back into a hue based on temperature. Ready? Here we go.

Let’s pretend it’s a brisk 50℉ outside.

If 100℉ is the hottest we go (360) and 0℉ is the coldest (200), then we have a color scale of 160 points. To figure out where in that 160 point range we need to be, we can divide the current temperature by the upper bound of 100℉ which will give us the exact percentage we need to move in our range, or 50%. If we move 50% of the way into a 160 point range, that leaves us at 80. Since we are starting at 200, that gives us a hue of 280.

That sounds complicated, but only because word problems in math SUCK. Here’s how the code looks when it’s all said and done…

let hue = 200 + (160 * (temperature / 100));
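Wrapped in a function with the 0℉–100℉ clamp applied, a sketch of the whole conversion looks like this:

```javascript
function tempToHue(temperature) {
  // Clamp to the 0–100℉ range the scale is built on
  const temp = Math.min(Math.max(temperature, 0), 100);
  // 200 is light blue (0℉); 160 points later, 360 is red (100℉)
  return 200 + (160 * (temp / 100));
}

console.log(tempToHue(0));   // 200
console.log(tempToHue(50));  // 280
console.log(tempToHue(100)); // 360
```

Anything below 0℉ stays pinned at 200 and anything above 100℉ stays at 360, which matches the “it can only get so hot” decision above.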

OK! We’ve got a dynamic color scale based on hue, and wouldn’t you know it, we can just pass the hue to LIFX as simply as we pass a named color.

Now we just need to find out what the current temperature is, back into a hue and do that every few minutes. Serverless, here we come!

Serverless Timer Functions

Serverless is all the rage. It’s like HTML5 used to be: it doesn’t matter what it is, it only matters that you know the word and are not afraid to use it in a blog post.

For this example, we’ll use Azure Functions because there is support for timer triggers, and we can test those timer triggers locally before we deploy using VS Code. One of the things about Serverless that irritates me to no end is when I can’t debug it locally.

Using the Azure Functions Core Tools and the Azure Functions Extension for VS Code, I can create a new Serverless project and select a Timer Trigger.

Timer Triggers in Azure Functions are specified as Cron Expressions. Don’t worry, I didn’t know what that was either.

Cron Expressions allow you to get very specific with interval definition. Cron breaks things down into second, minute, hour, day, month, and day of the week. So if you wanted to run something every second of every minute of every hour of every day, your expression would look like this…

* * * * * *

If you wanted to run it every day at 10:15, it would look like this…

0 15 10 * * *

If you wanted to run it every 5 minutes (which is what Azure defaults to), you specify that by saying “when minutes is divisible by 5.”

0 */5 * * * *

For the purposes of this function, we set it to 2 minutes.

I am using a 2 minute interval because that’s how often we can call the weather API for free 💰.
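Following the divisible-by-5 pattern above, the 2-minute schedule lives in the function’s function.json. A minimal sketch of that binding (the name matches the myTimer parameter our function receives) might look like:

```json
{
  "bindings": [
    {
      "name": "myTimer",
      "type": "timerTrigger",
      "direction": "in",
      "schedule": "0 */2 * * * *"
    }
  ]
}
```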

Getting the Forecast From DarkSky

DarkSky has a wonderful weather API that you can call up to 1,000 times per day for free. If there are 1,440 minutes in a day (and there are), that means we can call DarkSky once every 1.44 minutes and stay in the free zone. I just rounded up to 2 minutes because temperature doesn’t change that fast.
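That back-of-the-napkin math, as a quick sketch:

```javascript
// DarkSky's free tier allows 1,000 calls per day
const minutesPerDay = 24 * 60; // 1440
const freeCallsPerDay = 1000;
const minInterval = minutesPerDay / freeCallsPerDay;

console.log(minInterval);            // 1.44 minutes between calls
console.log(Math.ceil(minInterval)); // 2 — the interval the timer uses
```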

This is what our function looks like when we call the DarkSky API. All of my tokens, keys, latitude and longitude settings are in environment variables so they aren’t hardcoded. Those are set in the local.settings.json file. I used axios for my HTTP requests because it is a magical, magical package.

const axios = require('axios');

module.exports = function (context, myTimer) {
  // build up the DarkSky endpoint
  let endpoint = `${process.env.DS_API}/${process.env.DS_SECRET}/${process.env.LAT},${process.env.LNG}`;

  // use axios to call DarkSky for weather
  axios
    .get(endpoint)
    .then(response => {
      let temp = Math.round(response.data.currently.temperature);
      // TODO: Set the color of the LIFX bulb
    })
    .catch(err => {
      context.log(err.message);
    });
};

Now that I have the temperature, I need to call the LIFX API. And wouldn’t you know it, someone has already created an npm package to do this called lifx-http-api. This is why you love JavaScript.

Setting the Bulb Hue

After the weather result comes back, I need to use the LIFX API instance and call the setState method. This method returns a promise which means that we need to nest promises. Nesting promises can get out of hand and could land us right back in callback hell, which is what we’re trying to avoid with promises in the first place.

Instead, we’ll handle the first promise and then return Promise.all, which we can handle in another top-level then. This just prevents us from nesting then statements.

Remember kids, promises are just socially acceptable callbacks.

const axios = require('axios');
const LIFX = require('lifx-http-api');

let client = new LIFX({
  bearerToken: process.env.LIFX_TOKEN
});

module.exports = function (context, myTimer) {
  // build up the DarkSky endpoint
  let endpoint = `${process.env.DS_API}/${process.env.DS_SECRET}/${process.env.LAT},${process.env.LNG}`;

  // use axios to call DarkSky for weather
  axios
    .get(endpoint)
    .then(response => {
      let temp = Math.round(response.data.currently.temperature);
      // make sure the temp isn't above 100 because that's as high as we can go
      temp = temp < 100 ? temp : 100;
      // determine the hue
      let hue = 200 + (160 * (temp / 100));
      // return Promise.all so we can resolve at the top level
      return Promise.all([
        response.data,
        client.setState('all', { color: `hue:${hue}` })
      ]);
    })
    .then(result => {
      // result[0] contains the darksky result
      // result[1] contains the LIFX result
      context.log(result[1]);
    })
    .catch(err => {
      context.log(err.message);
    });
};

Now we can run this thing locally and watch our timer do its thang.

That’s it! Let’s deploy it.

Deploying Weather Bulb

I can create a new Functions project from the VS Code extension.

I can right-click that to “Open in portal” where I can define a deployment source so it sucks my code in from GitHub and deploys it. This is ideal because now whenever I push a change to GitHub, my application automatically gets redeployed.

All Hail the Weather Bulb

Now just sit back and behold the soft glow of the Weather Bulb! Why look at the actual temperature when you can look at this beautiful shade of hot pink instead?

Can you guess what the temperature is based on what you know from this article? The person who leaves a comment and gets the closest will get a free LIFX lightbulb from me (because I ❤️ all of you), or the cost of the bulb if you are outside the U.S. (~$40).

You can grab all of the code for this project from GitHub.

The post Displaying the Weather With Serverless and Colors appeared first on CSS-Tricks.