Build a Viva Connections card to display stock prices - Part 3: deployment and security

Matteo Pagani
Mar 22, 2023

Welcome to the final post in the series about building a Viva Connections card to display stock prices! After building the backend in the first post and the card in the second post, now it's time to deploy our solution and to take a deeper look at the security of our API.

 

Let's start with the deployment of the backend!

Deploying the backend

Our backend is made up of an Azure Function and an Azure Cache for Redis instance. The easiest way to deploy these components is to use a template, so that we don't have to manually create and configure all the services. On the GitHub repository of the project, you can find a Bicep template. Bicep is a powerful language that you can use to deploy Azure resources and define their dependencies, properties, etc.

 

The template defines the following resources:

  • A web app hosting plan, using the Y1 SKU, which hosts the Azure Function.
  • An Azure Function app, hosted on Linux and running on the .NET runtime. The function already includes, in its application settings, the properties we set in the first post to manage the cache, like CacheConnection and CacheExpirationInHours.
  • An Azure Storage instance, which the Azure Function requires to work, since the runtime uses it to store its state and logs.
  • An Azure Cache for Redis instance. Thanks to the Bicep syntax, we can specify that the Azure Function depends on the cache, so that the cache is created first. This allows the URL assigned to the instance to be automatically injected into the CacheConnection setting of the Azure Function.
  • An Azure Application Insights instance, which is used to collect analytics and errors from the Azure Function.

The easiest way to deploy the Bicep template is to use Visual Studio Code, by installing the dedicated extension. Once you have installed it, click on the Azure icon in the left sidebar and make sure to log in with an account that has permission to create resources in your Azure subscription.

 

Now open the main.bicep file with Visual Studio Code. Press CTRL-SHIFT-P and, from the command palette, choose Bicep: Deploy Bicep file. You will be asked for the following information:

  1. The name of the deployment (you can leave the default one).
  2. The target subscription in which to deploy the resources.
  3. The resource group in which to place the resources (it's suggested to create a new one dedicated to this project).
  4. The Azure region in which to deploy the resources.

Then you will be asked if you want to provide a parameter file with the settings of the resources. We won’t use one, but we’ll provide the information as part of the deployment. As such, pick None. This will trigger a series of additional questions:

  1. The name to assign to the Azure Cache for Redis service. You can use the first option to automatically generate one or you can specify a custom one.
  2. The name to assign to the Azure Function. You can use the first option to automatically generate one or you can specify a custom one.
  3. The Azure region in which to deploy the resources. Use the default value (resourceGroup().location) to create the resources in the same region you have chosen to host the resource group.
  4. The runtime used by the Azure Function. Leave the default value, which is dotnet.

As the last question, you will be asked if you want to create a parameters file with the values you have just specified. Choose Yes if you’re planning to repeat the process in the future, otherwise say No. At this point, the deployment process will start. In the Output window of Visual Studio Code, you will see a URL that you can open to check the status of the deployment:

 

 

When you open the URL, you will be redirected to the Azure portal to see the deployment in action:

 

 

Most of the resources will be deployed very quickly, while the Azure Cache for Redis service will take a while. Don't worry, this is expected. Once the deployment is completed, you will find all the resources required to host the solution in the resource group you have chosen.

 

Deploying the Azure Function

The previous step created on Azure all the services required to host our solution, but it didn't deploy the solution itself. Since we built the Azure Function with Visual Studio 2022, we can use the built-in Publish feature to deploy it. Right-click on the project and choose Publish. Click New to start the wizard to generate a publish profile:

  • Target: choose Azure.
  • Specific target: choose Azure Function App (Linux).
  • Functions instance: pick your Azure subscription and look for the Azure Function instance that you have just created with the Bicep template.
  • Deployment type: for this article, we'll choose Publish (generates pubxml file). However, if you want a better long-term solution, you can choose CI/CD using GitHub Actions, which will generate a GitHub Actions workflow that can automatically build and deploy an updated version of the Azure Function every time you commit changes to your code.

Once the process is completed, Visual Studio will display an overview of the deployment:

Click on Publish to start the deployment process. Once the operation is completed, you should be all set. To test it, try to use Postman or open your browser against the URL https://<your-function-url>/api/stockPrices/MSFT and you should see the JSON with the latest snapshot of the Microsoft stock.
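If you prefer to verify the endpoint from code rather than Postman, here is a minimal TypeScript sketch (runnable with ts-node on Node 18 or later, where fetch() is available globally; the function URL is a placeholder to replace with your own):

// Quick smoke test for the deployed API. Replace the placeholder with the
// hostname of the Azure Function created by the Bicep template.
const endpoint = "https://<your-function-url>/api/stockPrices/MSFT";

async function checkEndpoint(): Promise<void> {
  const response = await fetch(endpoint);
  console.log("Status:", response.status);      // expected 200 while the API is still unprotected
  console.log("Body:", await response.text());  // the JSON snapshot for the MSFT stock
}

checkEndpoint().catch(console.error);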

 

Before using the API in our card, there's an extra step we must follow. Do you remember that, in the previous post, we had to enable CORS in the local Azure Functions configuration file so that the Viva Connections dashboard would be able to reach the API? We must do the same now that we have published the function on Azure, since the domain which hosts it is different from SharePoint's domain. Go to the Azure portal, click on the Azure Function you have just deployed and look for the CORS option in the left panel, under the API section. Under the allowed origins section, add the URL of your SharePoint website, like in the following example, then hit Save:

 

 

Please note: if you have followed the first post step by step and you have already created an Azure Cache for Redis instance, you can skip using the Bicep template. In this case, however, you will have to:

  • In the Publish wizard in Visual Studio, instead of picking an existing Azure Functions instance, you must click on Create New and follow the wizard to create a new instance.
  • Once the Azure Function is up & running, you must open it in the Azure portal, click on Configuration and, under Application settings, add two keys: CacheConnection, with the full connection string to the Azure Cache for Redis instance, and CacheExpirationInHours, with the number of hours that must pass before the cache is invalidated.

Deploying the Viva Connections card

Viva Connections cards are considered SharePoint applications, so they are deployed through the SharePoint admin center. The first step is to generate a package. Open your card project in Visual Studio Code, click on Terminal → New Terminal and run the following two commands:

 

gulp bundle --ship

gulp package-solution --ship

The first command will package all the required JavaScript files into a single bundle. The second one will generate a file with the .sppkg extension under the sharepoint folder of your project. This is the package we must upload to the SharePoint admin center. To access the admin center, open your browser to the URL https://<your-sharepoint-tenant>-admin.sharepoint.com/. For example, if your company is called Contoso, the URL will be https://contoso-admin.sharepoint.com/.

 

Please note: you must have administrator rights on SharePoint to be able to access the portal and perform the next steps.

From the panel on the left, choose More features and click the Open button under Apps:

 

 

Once you are in the App Catalog, click on Upload and look on your computer for the file with the .sppkg extension that you have previously created. Once the file has been uploaded, you can choose to just enable the application or to also add it automatically to all the existing SharePoint sites. Pick the option that makes more sense for your scenario. The second one makes your work easier, because the card will already be available when you start configuring your Viva Connections dashboard; otherwise, you will need to manually add it to the SharePoint site which hosts your dashboard.

 

 

That's it. Now if you head to the SharePoint website which hosts your Viva Connections dashboard, click on Manage Viva Connections and enter editing mode, you will see your Stocks card among the ones available to add to the dashboard:

 

 

You will also have the option, by clicking on the pencil icon, to open the property panel and customize the properties we have defined in the second post, like the Azure Functions URL and the stock symbol to track.

 

Securing our backend

If you have followed the article so far, you will have a working card on your Viva Connections dashboard displaying the stock price of a company. However, as we anticipated in the previous post, the current implementation has a security flaw. The Azure Function URL is public, and it isn't protected in any way. Anyone with the full URL of our backend would be able to use the service we've built in their own solutions and projects. From a security perspective, this might not be a big problem: in the end, we aren't returning any sensitive information. From a cost perspective, however, it might be, because we could get unexpectedly expensive bills from Azure or Alpha Vantage (in case we opted for a paid plan) if someone else abused our API.

 

The best way to protect our API in this scenario is to use the Microsoft identity platform. Since our card is hosted on SharePoint, a platform that requires the user to be authenticated with their work account, we can leverage single sign-on: we won't have to ask the user to log in to use the API, since they are already authenticated. There are two changes we must make to support this scenario:

  • One in the backend, to protect the API. We won't have to change the code, since we'll leverage the built-in Authentication feature in Azure.
  • One in the card, to make authenticated calls against the protected API. The HTTP request must include the Authorization header with a valid token, as in the sketch after this list.
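To make the second point concrete, this is roughly what an authenticated call would look like if you built the request by hand (just an illustration, not code from the project; the token and function URL are placeholders):

// Illustration only: the SPFx AadHttpClient we'll use later acquires this token
// and attaches the header for us, so the card never builds the request manually.
async function callProtectedApi(): Promise<void> {
  const token = "<access token issued by Azure AD for the Stock Prices app>"; // placeholder

  const response = await fetch("https://<your-function-url>/api/stockPrices/MSFT", {
    // Without this header, the protected function will answer with HTTP 401 Unauthorized.
    headers: { Authorization: `Bearer ${token}` }
  });

  console.log(response.status);
}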

Let's see step by step the changes to make.

 

Securing the Azure Function

As mentioned, we won't have to change the code of our Azure Function, since we're going to use the built-in authentication support provided by Azure. Open the Azure portal, go to the Azure Function you have previously deployed and, in the left sidebar, click on Authentication:

 

 

On the page, choose Add identity provider and pick Microsoft from the list. This will start a wizard that creates a new app registration in Azure Active Directory, which will be used to support SSO. Make sure to set the following options:

 

  1. Under App registration type, choose Create new app registration.
  2. Under Name, call it Stock Prices.
  3. Under Supported account types, choose Current tenant – Single tenant. This will ensure that only our employees will be able to use the API.

Then configure the App Service authentication settings section with the following settings:

  1. Under Restrict access, choose Require authentication.
  2. Under Unauthenticated requests, choose HTTP 401 Unauthorized.

 

Click Add to complete the process. Our Azure Function is now protected and, if you try to hit the URL again with Postman or your browser, you will get back a 401 Unauthorized error.

Please note: the previous guidance assumes that your Azure Active Directory tenant is hosted on the same domain as your SharePoint tenant. If they are hosted on different domains, you must create the AAD app registration manually and supply the relevant information (application ID, tenant ID, client secret) to the Azure Function configuration. You will also need to choose the option Any Azure AD directory - Multi tenant under Supported account types.

To use the AAD app we have just registered to protect our API, we must make a final change in the app configuration. Still in the Azure portal, go to the Azure Active Directory section and, from the sidebar, choose App registrations. Look for the application called Stock Prices that you have just created and go to the Expose an API section. Click on Add a scope and use the following settings:

  • Under Scope name, type access_as_user.
  • Under Who can consent?, choose Admins and users.
  • Under Admin consent display name, type Access the API on behalf of a user.
  • Under Admin consent description, type Access the API on behalf of a user.
  • Make sure State is set to Enabled.

 

Click Save to complete the process. Now our API is protected, so we can move on and make the required changes to the card.

 

 

Securing the card

From a data manipulation perspective, the code of our card doesn't change. We still need to perform an HTTP GET against our API and parse the JSON we receive back with the stock data about the company we have chosen. The difference, however, is the way we perform the HTTP GET, since now we must include a valid token inside the request to authenticate ourselves. Luckily, the SharePoint Framework makes this operation easy: when employees are using Viva Connections (or visiting any SharePoint site), they're already logged in with their work identity, so SPFx includes a special HTTP client which is already authenticated through single sign-on. We just need to use it instead of the traditional fetch() API. First, let's add the following statement at the top of our StocksAdaptiveCardExtension.ts file:

 

import { AadHttpClient } from '@microsoft/sp-http';

Then, let's change the fetchData() function as follows:

 

public async fetchData(): Promise<void> {
    // Build the endpoint URL from the card properties: <apiUrl>/<stockSymbol>
    const url = this.properties.apiUrl + "/" + this.properties.stockSymbol;

    // Get an HTTP client that is already authenticated against our AAD app
    const aadClient = await this.context.aadHttpClientFactory.getClient("api://c94af769-ef30-4d37-a660-b602aab24d5b");
    const response = await aadClient.get(url, AadHttpClient.configurations.v1);

    // Read the response body and parse it into our IStockPrice model
    const json = await response.text();
    const parsedJson: IStockPrice = JSON.parse(json) as IStockPrice;

    this.setState({ stock: parsedJson });
}

We use the getClient() function, exposed by this.context.aadHttpClientFactory, to get an authenticated client. As a parameter, we must pass the special URL api:// followed by the app ID of the AAD app that we previously registered. We can find this URL under the Expose an API section of our AAD app:

 

 

 

Once we have an authenticated client, we perform the HTTP GET request by calling the get() method and passing, as parameters, the URL of our Azure Function (which is stored in the properties) and the version to use, through the constant AadHttpClient.configurations.v1.

This is the only change we must make. The rest of the code is the same: we take the response, turn it into a string and then parse it as an IStockPrice object.

The final change we must make is in the manifest that is generated when we create the package to upload to the SharePoint admin portal. In fact, we must declare that our card uses an external API. You can find this configuration in the package-solution.json file, under the config folder. Make sure to add the following entry under the solution property:

 


{
  "$schema": "https://developer.microsoft.com/json-schemas/spfx-build/package-solution.schema.json",
  "solution": {
    "webApiPermissionRequests": [
      {
        "resource": "Stock Prices",
        "scope": "access_as_user"
      }
    ]
  }
}

The value of the resource property must match the name we have assigned to the app registration we have created in our AAD tenant.

 

The last step is to repeat the two gulp-based commands we have previously seen to generate a new .sppkg package and upload it to the Apps section of the SharePoint admin portal. This time, after choosing whether we want to only enable the app or to automatically add it to all the sites, we will see a new prompt:

 

 

We must approve API access for our card, otherwise it won't be able to perform authenticated requests. Click on Go to API Access page, where you will find a pending request:

 

 

Click on Approve to complete the process.

 

That's it. Now if you add the card again to your Viva Connections dashboard, everything should continue to work as before. The difference, however, is that the API is now protected and only your card will be able to retrieve the stock data. Any user outside the company, even if they know the URL of your Azure Function, won't be able to access the data: they will just get a 401 Unauthorized response.

 

Please note: the project that you will find on the GitHub repository supports both authenticated and unauthenticated requests, through a new property added to the panel called AAD App Id. When you add the card to the dashboard, you can use this property to specify the App ID of the AAD application you have created to support single sign-on. If the property is set, the fetchData() function will use the AadHttpClient to perform an authenticated request; if the property is left empty, it will instead use the standard unauthenticated fetch() API.
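As a rough idea of how that branching can look, here is a sketch of fetchData() (the property name aadAppId is an assumption made for this illustration; check the repository for the exact identifiers):

public async fetchData(): Promise<void> {
  const url = this.properties.apiUrl + "/" + this.properties.stockSymbol;
  let json: string;

  if (this.properties.aadAppId) {
    // Authenticated path: let SPFx acquire a token for the AAD app and attach it for us.
    const aadClient = await this.context.aadHttpClientFactory.getClient(`api://${this.properties.aadAppId}`);
    const response = await aadClient.get(url, AadHttpClient.configurations.v1);
    json = await response.text();
  } else {
    // Unauthenticated path: the plain fetch() call we used in the previous post.
    const response = await fetch(url);
    json = await response.text();
  }

  this.setState({ stock: JSON.parse(json) as IStockPrice });
}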

Wrapping up

In this final post of the series, we have learned how to take our stock card for Viva Connections into production, by deploying the backend on Azure and by uploading the card as a SharePoint app through the admin portal. We have also learned how to protect our API, which is optional but highly recommended to prevent unauthorized users from abusing our backend.

 

Everything we have learned so far is available on GitHub. The repository includes the Azure Function (in the function folder), the card (in the card folder) and the Bicep template to deploy the Azure resources (again, in the function folder).

 

Happy coding!
