First published on MSDN on Dec 20, 2017
Authored by Andreas Helland
I know, the title is a mouthful, but I wasn't able to whittle it down without losing context. (Sure, I could have gone the clickbait route, I suppose.) Let me set up the scenario for you :)
Let's say you have an IoT lab like described in my previous post:
https://blogs.msdn.microsoft.com/azuredev/2017/12/13/controlling-your-iot-home-with-azure-functi...
The Hue devices let you query for status, and they will return a json response to you. Something like this:
[code language="csharp"]
{
"on":true,
"sat":254,
"bri":254,
"hue":10000
}
[/code]
Well, that is nice and dandy, right? Maybe we want to store that somewhere and use it later. The easy thing to do is store the json document in Azure Data Lake Store or Azure Cosmos DB. But what if we want to act upon the contents as well?
The whole querying business is handled quite nicely by an Azure Function, as already demonstrated. Now, we could bake all kinds of logic into this Function to parse through the returned json, but that might not be an ideal way of building things. It might be better to send that response off somewhere else to act upon it (if needed), and send it off to storage afterwards.
Since "assembling the bigger picture pipeline" is not in focus here I'll jump to the conclusion as to what components will be involved. We push the json to an Azure Event Hub where it can be picked up by a Stream Analytics job and be processed further. (Initiate an action or just store it somewhere.) We will assume for the sake of the argument that limits on the payload you want to send are a non-issue as well. Event Hubs are good at receiving data, and Stream Analytics is a decent dispatcher.
Step 0
Starting simple, you look up the requirements for authenticating with Event Hubs, and subsequently attempt to push a simple json document. It's HTTP, so that part is easy, but you need to use a static key and build a signature to authenticate. You might end up with some code like this (I ran this in an Azure Function):
[code language="csharp"]
using System;
using System.Text;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Security.Cryptography;
public static void Run(string input, TraceWriter log)
{
var resourceUri = "contoso.servicebus.windows.net";
var keyName = "RootManageSharedAccessKey";
var key = "randomkey";
TimeSpan sinceEpoch = DateTime.UtcNow - new DateTime(1970,1,1);
var week = 60 * 60 * 24 *7;
var expiry = Convert.ToString((int)sinceEpoch.TotalSeconds + week);
string stringToSign = Uri.EscapeDataString(resourceUri) + "\n" + expiry;
HMACSHA256 hmac = new HMACSHA256(Encoding.UTF8.GetBytes(key));
var signature = Convert.ToBase64String(hmac.ComputeHash(Encoding.UTF8.GetBytes(stringToSign)));
var sasToken = $"sr={Uri.EscapeDataString(resourceUri)}&sig={Uri.EscapeDataString(signature)}&se={expiry}&skn={keyName}";
var eventHubUrl = "
https://contoso.servicebus.windows.net/foo/messages";
var content = "{\"on\":true, \"sat\":254, \"bri\":254, \"hue\":10000}";
HttpClient Client = new HttpClient();
Client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("SharedAccessSignature", sasToken);
var foo = Client.PostAsync(eventHubUrl, new StringContent(content.ToString())).Result;
log.Info($"result: {foo}");
}
[/code]
This does what it is supposed to, but do you like the looks of it? It's possibly not the prettiest Function you've ever seen. That's not the main issue with it though - what if I were to move this into something other than a Function, where I don't want to hardcode these keys and rely on this signing logic?
Not to mention that I have been preaching Azure AD as the identity of choice for a long time, and this simply doesn't align with that. To be clear - I'm not saying the design of Event Hubs is wrong; it is designed either to be used back-end, or in scenarios where you want to ingest as much as possible per second without wasting time. The same goes for other resources in Azure as well. If you want to upload files to Azure Storage, you see the same pattern with shared access signatures (SAS).
So, to use these Azure resources from the front-end, we need an abstraction handling this, and keep the SAS token business on the back-end. Cue Azure API Management…
Now, Azure API Management (APIM) isn't a new feature; it's been around for a long time. It's not like I've invented the wheel or anything like that. Like any API gateway solution worth its salt, it lets you manipulate the HTTP request on the inbound side before passing it on to the back-end. For our purpose this is done through custom policies. (I will assume for the remainder of these steps that you have already created an instance of API Management. If not => run off, do that, and come back in twenty minutes or so when it has finished.)
Step 1
First let's add the API:
You can leave most fields empty.
Add a POST operation (the URL matches up with suffixing "/messages" to the name of our Event Hub, "foo").
Head to the "Design" section and hit the little down symbol in the "Inbound processing" area.
Type up a "set-header" policy like this (text for copy-paste later down the page):
This is basically the same code as in our Function, but we don't do the actual POST; we just have the calculated sasToken returned from the policy. What it does then is to replace the Authorization header in the request with a correct token for authenticating with the Event Hub. Neat!
Head over to the Test tab to verify this. Notice that you need to add the header, but you can set it to anything you like.
Hopefully you will get an HTTP 201 response if everything is okay. (If not, use the "Trace" feature to see if you can track it down. It's usually an incorrect URL or key.)
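If you would rather poke at it from code than from the portal, a minimal sketch could look like the snippet below. The APIM URL and subscription key are placeholders for your own values, and note how the Authorization header can hold any dummy value at this stage since the set-header policy overwrites it:
[code language="csharp"]
using System;
using System.Net.Http;

public static class ApimSmokeTest
{
    public static void Main()
    {
        // Placeholder values - swap in your own APIM url and subscription key
        var apimUrl = "https://contoso.azure-api.net/foo/messages";
        var content = "{\"on\":true, \"sat\":254, \"bri\":254, \"hue\":10000}";

        HttpClient client = new HttpClient();
        // At this stage the Authorization header can be anything;
        // the set-header policy replaces it with a valid SAS token
        client.DefaultRequestHeaders.Add("Authorization", "dummy");
        client.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key", "subKey");

        var response = client.PostAsync(apimUrl, new StringContent(content)).Result;
        Console.WriteLine(response.StatusCode); // hopefully Created (201)
    }
}
[/code]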
This solves part one of our problem. We don't need the SAS Token bits in our Function. Did I not mention part two?
Step 2
Thing is, we have now moved auth to the back-end, but more or less removed it from the front-end of the API interface. Sure, you need to have a subscription key, but if you have that you can send off anything you like. And we don't really want that. We still want to protect our API, but we want to accept JWT tokens instead of SAS tokens. So what we do is add a new section to our API policy: a "validate-jwt" element that we need to place above our current set-header policy (you can see it at the top of the complete policy further down).
APIM is nice enough to process the sections in order, so it will first validate the JWT, and only if the JWT is okay will it replace the Authorization header.
I'm only doing simple validation here, so as long as the token is issued by the common endpoint in Azure AD with the https://management.core.windows.net/ audience it will work. (That audience would be correct for the token you get when signing in to the Azure Portal, but the openid-configuration URL depends on your sign-in mechanism.) You will probably need to adjust these parameters to make sure they are correct for your environment. If you want to go the quick and dirty route, log in to the Azure Portal with Fiddler running in the background, and snatch the token from there. If you create a separate app in Azure AD with a corresponding id and secret, the audience will be the app URI.
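For illustration, a validate-jwt element adjusted for a dedicated app might look something like this sketch; the tenant name and app URI are placeholders you would swap for your own:
[code language="xml"]
<validate-jwt header-name="Authorization" failed-validation-httpcode="401" failed-validation-error-message="Unauthorized">
    <!-- Placeholder tenant - point this at your own tenant's metadata endpoint -->
    <openid-config url="https://login.microsoftonline.com/contoso.onmicrosoft.com/.well-known/openid-configuration" />
    <audiences>
        <!-- Placeholder audience - the App ID URI of the app you registered in Azure AD -->
        <audience>https://contoso.onmicrosoft.com/eventhub-api</audience>
    </audiences>
</validate-jwt>
[/code]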
Anywho, the complete policy looks roughly like this:
[code language="csharp"]
<inbound>
<validate-jwt header-name="Authorization" failed-validation-httpcode="401" failed-validation-error-message="Unauthorized">
<openid-config url="
https://login.microsoftonline.com/common/.well-known/openid-configuration" />
<audiences>
<audience>
https://management.core.windows.net/</audience>
</audiences>
<required-claims />
</validate-jwt>
<set-header name="Authorization" exists-action="override">
<value>@{
var resourceUri = "contoso.servicebus.windows.net";
var keyName = "RootManageSharedAccessKey";
var key = "randomKey";
TimeSpan sinceEpoch = DateTime.UtcNow - new DateTime(1970, 1, 1);
var week = 60 * 60 * 24 * 7;
var expiry = Convert.ToString((int)sinceEpoch.TotalSeconds + week);
string stringToSign = Uri.EscapeDataString(resourceUri) + "\n" + expiry;
HMACSHA256 hmac = new HMACSHA256(Encoding.UTF8.GetBytes(key));
var signature = Convert.ToBase64String(hmac.ComputeHash(Encoding.UTF8.GetBytes(stringToSign)));
var sasToken = String.Format("SharedAccessSignature sr={0}&sig={1}&se={2}&skn={3}", Uri.EscapeDataString(resourceUri), Uri.EscapeDataString(signature), expiry, keyName);
return sasToken;
}
</value>
<!-- for multiple headers with the same name add additional value elements -->
</set-header>
<set-backend-service id="apim-generated-policy" base-url="
https://contoso.servicebus.windows.net" />
</inbound>
[/code]
Step 3
Moving back to Azure Functions, we can rework our logic there accordingly. Maybe something like this instead:
[code language="csharp"]
using System;
using System.Text;
using System.Net.Http;
using System.Net.Http.Headers;
public static void Run(string input, TraceWriter log)
{
log.Info($"C# manually triggered function called with input: {input}");
var apimUrl = "
https://contosio.azure-api.net/foo/messages";
var content = "{\"on\":true, \"sat\":254, \"bri\":254, \"hue\":10000}";
var AADToken = "token";
HttpClient Client = new HttpClient();
Client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", AADToken);
Client.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key","subKey");
var foo = Client.PostAsync(apimUrl, new StringContent(content.ToString())).Result;
log.Info($"result: {foo}");
}
[/code]
Doesn't this look slightly better?
Ah, but Andreas, you omitted the token part here too, you sneaky bastard. For this to work I have to acquire the token manually and paste it in, which is quite the hassle. Yes, I did skip that part - that was to focus on the API call first. You can add some extra code to acquire the token inside the Function, and pass that along.
Step 3.5
A complete working Function would look like this:
[code language="csharp"]
using System;
using System.Text;
using System.Net.Http;
using System.Net.Http.Headers;
using Newtonsoft.Json;
public static void Run(string input, TraceWriter log)
{
log.Info($"C# manually triggered function called with input: {input}");
var apimUrl = "
https://contosio.azure-api.net/foo/messages";
var content = "{\"on\":true, \"sat\":254, \"bri\":254, \"hue\":10000}";
var AADToken = getToken().Result;
HttpClient Client = new HttpClient();
Client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", AADToken);
Client.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key","subKey");
var foo = Client.PostAsync(apimUrl, new StringContent(content.ToString())).Result;
log.Info($"result: {foo}");
}
public static async Task<string> getToken()
{
var domain = "contoso.onmicrosoft.com";
var clientId = "id";
var clientSecret = "secret";
var resource = "app uri";
HttpClient client = new HttpClient();
string requestUrl = $"
https://login.microsoftonline.com/{domain}/oauth2/token";
string request_content = $"grant_type=client_credentials&resource={resource}&client_id={clientId}&client_secret={clientSecret}&scope=openid";
HttpRequestMessage request = new HttpRequestMessage(HttpMethod.Post, requestUrl);
try
{
request.Content = new StringContent(request_content, Encoding.UTF8, "application/x-www-form-urlencoded");
}
catch (Exception x)
{
var msg = x.Message;
}
HttpResponseMessage response = await client.SendAsync(request);
string responseString = await response.Content.ReadAsStringAsync();
GenericToken token = JsonConvert.DeserializeObject<GenericToken>(responseString);
var at = token.access_token;
return at;
}
internal class GenericToken
{
public string token_type { get; set; }
public string scope { get; set; }
public string resource { get; set; }
public string access_token { get; set; }
public string refresh_token { get; set; }
public string id_token { get; set; }
public string expires_in { get; set; }
}
[/code]
I can hear the snide remarks already - "What a genius this guy is - we started with a 30 line Function, and ended up with a 60 line Function + a complicated script in API Management." Uh, yes, I can see that one.
As always, the small code snippets I show can come across as a little convoluted, not to mention that this is a veritably over-engineered effort. For one minor data-reporting Function like this it doesn't make sense. But I would say that architecturally it is more sound to abstract the SAS token authentication away from the front-end and towards the back-end resource, and correspondingly keep the OAuth-based flows on the client side. (With the standard disclaimer that this might not apply to your specific scenario, and don't hold me responsible if this isn't the right technique for you.) There is also the minor snag of how to validate JWTs issued by different IdPs, with different scopes, etc. You might have to tackle those beasts as well before going live in production. However, I believe we've already reached the limit of what we are able to digest for now :)