
An Azure Function for a smart stupid web part – Part 2

In my first blog post of this series, I discussed how much intelligence you need in your web part when it comes to third-party APIs. Sometimes it makes more sense to move the business logic out of the web part and use web parts just as the presentation layer. In the previous post, I also mentioned that Azure Functions might be beneficial for this.


The Azure Portal lets you implement an Azure Function directly in its user interface, but there is another option. Nowadays it is possible to write Azure Functions locally and deploy them to the cloud later. In most cases, this approach is more convenient, less error-prone, and easier to debug, directly from Visual Studio Code for example.

Local Azure Function

The SharePoint Framework workbench is a great tool. Once you have it set up, you can write your web part without any internet connectivity, unless you need to access real data.

Azure Functions supports local development in the same way. The article “Code and test Azure Functions locally” on docs.microsoft.com provides all the information you need to get started developing a local Azure Function.

Once you have installed the prerequisites, you can create a new Azure Functions project with the following command.
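The exact call depends on your tooling version; with the Azure Functions Core Tools it might look like this (the project name 'VimeoRequest' matches the folder mentioned below):

```shell
func init VimeoRequest
```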

In the root project directory, a new folder with the specified name ‘VimeoRequest’ is generated, and all the requirements to run an Azure Function are installed into this folder.

The initialisation of new Azure Function Host

Right now, this is just an empty project folder. The initialisation process doesn’t add any callable function; it only gives you the boilerplate.

Add the first function to Azure Functions host

Unlike Yeoman and the SharePoint Framework, Azure Functions follows a CLI (command-line interface) approach, similar to what the Office 365 CLI does.

To create the first callable function, you have to execute the following command:
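Based on the switches explained below and the ‘Search’ folder used later in this post, the call presumably looks like this:

```shell
func new --language JavaScript --template "HttpTrigger" --name Search
```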

‘func new’ is the command to create a new function; the ‘--language’ switch defines how you prefer to write your code, in this case JavaScript. The ‘--template’ switch allows you to pass in what kind of Azure Function you want to write; in this case, we only need a simple HTTP trigger. Finally, the ‘--name’ argument defines the name and folder of our Azure Function endpoint.

The creation of the first Azure Function in Host Application


After the execution, you will find a new folder containing three files: ‘index.js’, ‘function.json’ and ‘sample.dat’.

‘index.js’ stores the program code, and this is the file where we add the functionality later. ‘function.json’ allows you to configure the behaviour and binding of the Azure Function. In this case, the provided file content looks like this:
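The generated binding configuration for the HTTP trigger template should look roughly like this (details may vary with the tools version):

```json
{
  "disabled": false,
  "bindings": [
    {
      "authLevel": "function",
      "type": "httpTrigger",
      "direction": "in",
      "name": "req"
    },
    {
      "type": "http",
      "direction": "out",
      "name": "res"
    }
  ]
}
```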

Complete documentation of all configuration options is available in the Azure Functions host documentation on GitHub.

For now, everything is set up to launch the sample Azure Function for the first time.

Launch Azure Function Host

The command to start the local web server hosting the first function created is a simple call.
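With the Azure Functions Core Tools, that call is:

```shell
func host start
```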

After the initialisation of the local Azure Function host, you will find the URL our service is running on in the console output.

First Run of local Azure Function

For the first test run, use the following URL and paste it into your browser.
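Assuming the default port and the function name ‘Search’, the test URL would look something like this (the HTTP trigger template echoes the ‘name’ query parameter):

```
http://localhost:7071/api/Search?name=World
```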

Your browser should then display a proper “Hello World” on the screen.

Result of first access to Azure Function in Browser


So now that everything is running, we can start to implement the actual web service.

To stop the Azure Function host and all running jobs, just press ‘Ctrl’ + ‘C’.

A JavaScript Azure Function is NodeJS

Whenever you create an Azure Function based on JavaScript, you have NodeJS running as the base technology. This circumstance is pretty convenient because you can use and install any NPM package in your Azure Function.

As I mentioned in my first blog article, Vimeo provides an officially supported client for NodeJS. The package is available on NPM, but before you can install an NPM package, you need to change on the console to the ‘Search’ folder created previously. Each function only supports NodeJS packages installed directly inside its endpoint folder.

If you plan to provide different functions with similar NPM packages, you have to install those packages multiple times.

Once in the Search folder, you have to create a new empty ‘package.json’, because this is needed later when we want to deploy the Azure Function to the cloud.
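The standard way to create it is NPM's interactive init command, run inside the ‘Search’ folder:

```shell
npm init
```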

 

Creation of local NPM package in Azure Function

Just answer the questions, and you get a new empty ‘package.json’ that you can use to record all further references to the packages you need.

Since we are not planning to publish this ‘package.json’ to NPM, you can add the “private” attribute and set it to true.
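The resulting file could then look like this minimal sketch (name and version are just examples):

```json
{
  "name": "search",
  "version": "1.0.0",
  "private": true
}
```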

Now everything is prepared to install the official Vimeo Library.
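The library is installed from NPM with:

```shell
npm install vimeo --save
```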

The ‘--save’ switch makes sure that this library is a runtime requirement and referenced correctly in the ‘package.json’.

That is all for now for the basic setup, and the next step is to add the configuration as well as the code to access Vimeo.

Code and Configuration to access Vimeo

The first thing to do is to reference the Vimeo client library in the Azure Function. As in regular SPFx projects, you need to define a variable and import the library via a ‘require’. Add the following line of code at the beginning of your ‘index.js’ file.
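Assuming the ‘vimeo’ NPM package installed above, the import looks like this:

```javascript
// Import the official Vimeo client library installed from NPM
const Vimeo = require('vimeo').Vimeo;
```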

Now everything is set up for the data access to Vimeo, but for the authentication against this platform we need a properly registered app on the Vimeo Developer Portal. After the registration, you get an App ID and an App Secret. This information is needed to retrieve an access token to operate against their API.


Vimeo APP ID – App Secret

The App ID and App Secret shouldn’t be stored in the source code of your Azure Function; that is not the appropriate place, neither online nor in your local source code. In the root folder of the Azure Function host you find a ‘local.settings.json’ file; this is the right place to store such information.

The local host settings to store AppID and APP Secret

This file is the local equivalent to the Application Setting on Azure.
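A minimal ‘local.settings.json’ could look like the sketch below; the setting names ‘VimeoAppId’ and ‘VimeoAppSecret’ are my assumptions, not fixed names:

```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "",
    "VimeoAppId": "<your-app-id>",
    "VimeoAppSecret": "<your-app-secret>"
  }
}
```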

Access settings within the code

To access the App ID and App Secret directly from the code, the following three constants can be defined.

All application settings are available as environment variables of the process, and instead of writing ‘process.env.’ every time, you can refer to these constants.
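A sketch of those constants, assuming the setting names used in ‘local.settings.json’ above; the ‘public’ scope is taken from the description further down:

```javascript
// Credentials come from the application settings (environment variables);
// the setting names 'VimeoAppId' and 'VimeoAppSecret' are assumptions.
const appId = process.env['VimeoAppId'];
const appSecret = process.env['VimeoAppSecret'];
// Permission scope for the Vimeo API - 'public' grants access to public videos
const scope = 'public';
```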

The rest of the code

The rest of the implementation is pretty straightforward and would be worth another post, but it would only contain specifics of the Vimeo client. Instead, I posted it in a Gist on GitHub.

It creates a new Vimeo client, passing in the App ID, the App Secret and the permission scope. In this case, the permission scope is ‘public’, to access all available public videos.

This third-party API uses OAuth 2.0, so the first step is to retrieve the access token. This token gets assigned to the Vimeo client, and all following API calls will use it.

To keep it simple, in this example I pass the HTTP headers and the body straight through as the result of the Azure Function, so no transformation has been implemented.
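The full implementation is in the Gist mentioned above; the following is only my sketch of the flow just described, using the ‘vimeo’ client library. The setting names, the ‘/videos’ path and the query handling are assumptions:

```javascript
// index.js - sketch of the function body, not the author's exact code
const Vimeo = require('vimeo').Vimeo;

const appId = process.env['VimeoAppId'];
const appSecret = process.env['VimeoAppSecret'];

module.exports = function (context, req) {
    const client = new Vimeo(appId, appSecret);

    // OAuth 2.0 first step: retrieve an access token for the 'public' scope
    client.generateClientCredentials('public', function (err, response) {
        if (err) {
            context.res = { status: 500, body: err.message };
            context.done();
            return;
        }

        // Assign the token by creating an authenticated client;
        // all following API calls will use it
        const authedClient = new Vimeo(appId, appSecret, response.access_token);

        // Query public videos and pass headers and body straight through
        authedClient.request(
            { path: '/videos', query: { query: req.query.q } },
            function (error, body, statusCode, headers) {
                context.res = error
                    ? { status: 500, body: error.message }
                    : { status: statusCode, headers: headers, body: body };
                context.done();
            });
    });
};
```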

HTTP vs HTTPS in Azure Functions

By default, the local Azure Function host only works over HTTP, but this is a problem for the SharePoint Framework, because the workbench runs on HTTPS by default. So you would be trying to access information from a less secure endpoint.

What to do now? From my perspective, you shouldn’t use anything other than HTTPS for your local development, especially since all remote third-party APIs run on HTTPS, which can otherwise cause additional challenges. The better approach is to configure your local Azure Function host to use SSL.

The easiest way to get a self-signed certificate is to create one using OpenSSL.

You can get the recommended binaries for your system from their wiki page.

Step 1 – Generate a Private Key

Once you have downloaded and installed OpenSSL, you first need to create a private key. To generate the private key file, execute the following command in PowerShell, Bash, zsh or whatever shell you use.
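A common way to do this is the following; key size and cipher are my choice, and OpenSSL will prompt for the passphrase:

```shell
openssl genrsa -des3 -out server.key 2048
```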

The console output should look like this:

You have to enter a passphrase to secure your private key, and after that a ‘server.key’ file gets generated in the root folder of the host. Make sure you don’t run this command from inside your Azure Function endpoint folder.

The next step is to create a new Certificate Signing Request.

Step 2 – Generate a Certificate Signing Request

To create a new certificate signing request execute the following command using OpenSSL again.
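Based on the ‘-key server.key’ argument described below, the call is presumably:

```shell
openssl req -new -key server.key -out server.csr
```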

The argument ‘-key server.key’ uses the previously created private key for the certificate signing request. Before the request is issued, you have to provide additional information, also known as X.509 attributes, that gets added to your certificate.

Enter a password again as the final step, and after that you should find a new file named ‘server.csr’ in your project folder.

Step 3 – Generate a Self-Signed Certificate

After you have created the certificate signing request successfully, you are all set to generate a new certificate, again through OpenSSL and the following command.
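Matching the switches explained below, the command should look like this:

```shell
openssl x509 -req -days 730 -in server.csr -signkey server.key -out server.crt
```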

The ‘-days 730’ switch makes sure that your certificate is valid for 730 days. The command also takes your certificate signing request (‘-in server.csr’) and your private key (‘-signkey server.key’) and stores your self-signed certificate in the output file ‘server.crt’.

The output on your console should look similar to this.

Enter your private key passphrase again, and you finally have your needed certificate.

Not so fast: we first need to create a PKCS12 file as the final step.

Step 4 – PKCS12

This is indeed the final OpenSSL command to execute. To run the Azure Function over SSL, you have to create a PKCS12 file, a Public-Key Cryptography Standards archive.

Simplified, you take your ‘server.key’ and your ‘server.crt’ certificate and bundle them together in a file named ‘server.pfx’. Now we have everything to launch our Azure Function host using SSL.
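The bundling step described above maps to this OpenSSL call; you will be prompted for an export password, which is the passphrase needed later to start the host:

```shell
openssl pkcs12 -export -out server.pfx -inkey server.key -in server.crt
```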

Start Azure Host with SSL and CORS

The default port of our Azure Function host is port 7071. The workbench runs on localhost on port 4321. Both are localhost, but because of the different ports you need to enable CORS support for your Azure Function; otherwise, you won’t be able to request any data from this service. To enable CORS (cross-origin resource sharing), add “--cors '*'” to the launch parameters.

The second thing is that our certificate and the password (aka passphrase) we used to build the PFX package need to be passed in.

The complete start command for our Azure Function Host is the following.
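Putting the certificate, password and CORS settings together, the start command presumably looks like this (replace the placeholder with your actual passphrase):

```shell
func host start --useHttps --cert server.pfx --password '<your-passphrase>' --cors '*'
```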

It is important that you pass the password and the list of CORS-allowed domains in quotes; otherwise, the Azure Function host crashes.

Local Azure Function running on HTTPS

Now everything is ready to access our custom web service built on Azure Functions from the workbench. In the next blog post, I will show how to use a SharePoint Framework web part to search, select and embed videos from Vimeo.

In case you missed the other blog posts, check out the links below: