This is the second of two blog posts on this topic; this one takes the Node.js app from Part 1 that works against Azure Storage Blobs for image manipulation and gets it running in an Azure Function, with an external executable running in the context of an Azure App Service. Please see Part 1 to get caught up if necessary.
First, let me do an important disclaimer, before I even talk about anything:
This Sample Code is provided for the purpose of illustration only and is not intended to be used in a production environment. THIS SAMPLE CODE AND ANY RELATED INFORMATION ARE PROVIDED "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE IMPLIED WARRANTIES OF MERCHANTABILITY AND/OR FITNESS FOR A PARTICULAR PURPOSE. We grant You a nonexclusive, royalty-free right to use and modify the Sample Code and to reproduce and distribute the object code form of the Sample Code, provided that You agree: (i) to not use Our name, logo, or trademarks to market Your software product in which the Sample Code is embedded; (ii) to include a valid copyright notice on Your software product in which the Sample Code is embedded; and (iii) to indemnify, hold harmless, and defend Us and Our suppliers from and against any claims or lawsuits, including attorneys’ fees, that arise or result from the use or distribution of the Sample Code.
OK, now that that's out of the way... last time, I left you with a Node.js app running locally on your machine that knew how to read a picture from an Azure Blob, call out to a local-to-the-app copy of the GraphicsMagick executable through the gm module to manipulate that image, then write that image back to a new Azure Blob. Looking back at that code, it's all pretty straightforward, especially given that the executable can work with pipes, and the module exposes that.
So, you have this code now, and it's working on your machine. Let's get that into the Azure Function app! Last time, you created that app, and the underlying Azure Storage Account, which is what you're working with for the images. That app as mentioned is using a local executable. Luckily for my demo, that executable and its associated libraries are essentially built in a "portable" way, so everything strictly needed is local to the installation folder. That makes it easy for us here, because we can just put that folder up there.
Start by creating a ZIP file with the necessary application parts. Last time, the executables were copied wholesale out of the Program Files folder to the utils folder of the code. Open that folder in File Explorer, and delete the folders and files that aren't related to the executable. (You don't strictly need to do this, but why pay for storage you don't need?) This means deleting the following folders:
and, in the case of version 1.3.36 on Windows, which is what I have installed, the following files:
You now have a "minimum" install folder. (It's important to note that I'm not demoing any of the GraphicsMagick functionality that uses GhostScript; if I was, I'd have to deal with that dependency.) In the utils folder, ZIP the GraphicsMagick-version folder through right-click, Send to, Compressed (zipped) folder or, on Windows 11, right-click, Compress to ZIP file. (You can do this however you want, of course, if this is not your preferred method.) Hold on to this ZIP file for now; it's going to come up later on after some other work in the Azure Portal.
Open the Azure Portal and navigate to your Azure Function App Service (e.g. through Search at the top) if you are not already there (we left things at the Storage Account, so you probably aren't). Under Functions, select Functions, then in the command bar at the top, select + Create. When prompted, choose to Develop in portal with the HTTP trigger template. Both of these are "because we're doing this as a demo" choices... in production, you might do something like an Azure Blob Storage trigger to automatically process blobs as created, or have a proper deployment pipeline as previously mentioned.
Once you select the HTTP trigger template, a new pair of fields will appear at the bottom. Enter a good name for the New Function (this name does not need to be unique across the Azure cloud you're working in, unlike the Azure Function App name) and leave the Authorization level at the default of Function. The Create operation happens very quickly, generally within a few seconds.
Once you have the function in place, we can put in the function code. Under Developer, select Code + Test, which will open up a decent text editor in the browser. Make the following changes:
This will leave your function body looking like this:
module.exports = async function (context, req) {
    context.res = {
        // status: 200, /* Defaults to 200 */
    };
}
Now:
module.exports = async function (context, req) {
    // #region Azure Storage stuff
    const {
        BlobServiceClient,
        StorageSharedKeyCredential,
        newPipeline
    } = require('@azure/storage-blob');
    //const accountName = process.env.AZURE_STORAGE_ACCOUNT_NAME;
    //const accountKey = process.env.AZURE_STORAGE_ACCOUNT_ACCESS_KEY;
    const accountName = 'nevergonnagive';
    const accountKey = 'youupnevergonnaletyoudownnevergonnaturnaroundandhurtyou==';
    const sharedKeyCredential = new StorageSharedKeyCredential(
        accountName,
        accountKey);
    const pipeline = newPipeline(sharedKeyCredential);
    const blobServiceClient = new BlobServiceClient(
        `https://${accountName}.blob.core.windows.net`,
        pipeline
    );
    // #endregion

    // #region GraphicsMagick stuff
    var gm = require('gm').subClass({
        appPath: __dirname + '/utils/GraphicsMagick-1.3.36-Q16/'
    });
    // #endregion

    // cf. https://williamandrewgriffin.com/how-to-download-from-azure-blob-storage-with-streams-using-express/
    // cf. https://github.com/Azure-Samples/azure-sdk-for-js-storage-blob-stream-nodejs/blob/master/v12/routes/index.js
    // cf. https://github.com/aheckmann/gm

    const inputBlobName = 'test.jpg';
    const inputContainerName = 'images';
    const outputBlobName = 'test.png';
    const outputContainerName = 'images';

    // if the containers are the same you can of course reuse the same container client
    var inputContainerClient = blobServiceClient.getContainerClient(inputContainerName);
    var outputContainerClient = blobServiceClient.getContainerClient(outputContainerName);
    var inputBlobClient = inputContainerClient.getBlockBlobClient(inputBlobName);
    var outputBlobClient = outputContainerClient.getBlockBlobClient(outputBlobName);

    async function processFile() {
        const downloadBlockBlobResponse = await inputBlobClient.download(0);
        var inputStream = downloadBlockBlobResponse.readableStreamBody;
        // I've hardcoded a resize and EXIF removal here just to show how the
        // fluent interface works here. Wrap the callback-style stream() in a
        // Promise so we can await the upload; otherwise the function can
        // return before the upload completes.
        await new Promise((resolve, reject) => {
            gm(inputStream, inputBlobName)
                .resize(200)
                .noProfile()
                .stream((err, stdout, stderr) => {
                    if (err) { reject(err); return; }
                    outputBlobClient.uploadStream(stdout).then(resolve, reject);
                });
        });
    }
    await processFile();

    context.res = {
        // status: 200, /* Defaults to 200 */
    };
}
On the command bar at the top, select Save to save the code changes. You can't quite test this yet, because we haven't gotten the executables up to Azure yet, nor configured the Node.js packages. We'll do that now.
First, let's get that folder up to the filesystem underlying the App Service hosting your Azure Function, and get the Node.js configuration sorted while we're at it. As explained in the first post, this post isn't about Azure deployment processes such as CI/CD pipelines, so we're going to do this all manually, but in production you would very likely use a more automated process.
Close the function pane using the close button at the top right of the pane (not the browser close button):
This will take you back to the Function App pane. On the left, under Development Tools (you'll probably need to scroll down to get to it), select Advanced Tools, then Go ➡:
You can also use Search (Ctrl+/) at the top of the left-hand side to search for this. Either way, this will bring you into the main Kudu window. Kudu is the open-source engine behind git deployments in Azure App Service as well as a development support tool in its own right. In this case, we're going to use it to do some file uploading and some basic filesystem work.
Note that technically, I'm not following best practices here. We document that package.json should live in the root folder (not in an individual function), and we'll automatically do an npm install when you deploy from source control (but not in the Portal like we did here). But... say it with me... not production! The focus here is the external executable bit!
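For reference, a minimal package.json for this function might look like the following; the version ranges are illustrative, so pin whatever versions you actually tested against:

```json
{
  "name": "image-resize-function",
  "version": "1.0.0",
  "dependencies": {
    "@azure/storage-blob": "^12.11.0",
    "gm": "^1.23.1"
  }
}
```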
At this point, your function is deployed, with appropriate modules in the installation. Let's test it.
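Besides the portal's Code + Test pane, you can exercise an HTTP-triggered function from a short Node.js script by calling its URL with the function key (since we left the Authorization level at Function). The app name, function name, and key below are placeholders; global fetch assumes Node.js 18 or later.

```javascript
// Compose the function URL; the ?code= query string carries the function key.
function composeFunctionUrl(appName, functionName, functionKey) {
    return `https://${appName}.azurewebsites.net/api/${functionName}?code=${encodeURIComponent(functionKey)}`;
}

async function invoke(url) {
    const res = await fetch(url);
    console.log(res.status);
    return res.status;
}

// Placeholder values - substitute your own app, function, and key:
// invoke(composeFunctionUrl('myfuncapp', 'ResizeImage', '<function key>'));
```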
Congratulations, the Function App is calling the executable successfully!
So, what next here, if you were doing this for real? Well, proper CI/CD of course... and the storage account information should be in the Application settings for the function, ideally referencing Key Vault (notice the commented out code that references the process environment for this information)... but that's out of scope for the already long pair of blog posts.
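To sketch what that settings-based approach would look like: Application settings surface to the function as environment variables (the setting names here match the commented-out lines in the function code), and in the portal each setting's value can be a Key Vault reference of the form @Microsoft.KeyVault(SecretUri=...) so the secret never lives in app configuration. A minimal helper, with the environment passed in explicitly to keep it testable:

```javascript
// Read storage credentials from Application settings instead of hardcoding
// them; fail fast if either setting is missing.
function getStorageConfig(env) {
    const accountName = env.AZURE_STORAGE_ACCOUNT_NAME;
    const accountKey = env.AZURE_STORAGE_ACCOUNT_ACCESS_KEY;
    if (!accountName || !accountKey) {
        throw new Error('Storage account settings are missing');
    }
    return { accountName, accountKey };
}

// In the function body you would call:
// const { accountName, accountKey } = getStorageConfig(process.env);
```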