Today on Azure Friday: Durable Functions in Azure Functions


Chris Gillum joins Scott Hanselman to discuss a new extension of Azure Functions known as Durable Functions. Durable Functions is a programming model for authoring stateful, reliable, and serverless function orchestrations using C# and async/await.



For more information, see: 

5 Replies

Just a few things that should be done before Durable Functions can be used (at least when created in the Portal):


Moreover, the documentation is currently pretty poor. For example, I cannot figure out how to upgrade my existing function, which is triggered by a BlobTrigger (as soon as a new file is uploaded to a specific container) and basically does some computation on the contents of the uploaded CSV file. That is a good candidate for rewriting as a durable fan-out/fan-in function. So, the challenge for me is to understand how to trigger the orchestrator from a BlobTrigger.


I'm still trying to find the way, and I'd appreciate any information :)



You can use the samples in the GitHub repository as a reference. For example, there are two sample functions that show how you can trigger new orchestration instances:





From these samples, you should be able to figure out how to use other trigger types. Here is an example of how you might start an orchestration from a blob trigger:




#r "Microsoft.Azure.WebJobs.Extensions.DurableTask"

using System;
using System.Threading.Tasks;

public static async Task Run(string blobData, DurableOrchestrationClient starter, TraceWriter log)
{
    log.Info("Starting orchestration.");
    string instanceId = await starter.StartNewAsync("MyOrchestrator", blobData);
    string instanceId = await starter.StartNewAsync("MyOrchestrator", blobData);
    log.Info($"Started orchestration with ID = '{instanceId}'.");
}

{
  "bindings": [
    {
      "type": "blobTrigger",
      "direction": "in",
      "name": "blobData"
    },
    {
      "name": "starter",
      "type": "orchestrationClient",
      "direction": "in"
    }
  ],
  "disabled": false
}

Hi Chris,


First of all, thank you for your reply. I am very excited to hear directly from you :)

I managed to get BlobTrigger working.


However, I got another exception when the trigger started:



Exception while executing function: DurableEveDataAggregationEntryPoint

Microsoft.Azure.WebJobs.Host.FunctionInvocationException : Exception while executing function: DurableEveDataAggregationEntryPoint ---> Newtonsoft.Json.JsonSerializationException : Error getting value from 'ReadTimeout' on 'Microsoft.Azure.WebJobs.Host.Blobs.WatchableReadStream'. ---> System.InvalidOperationException : Timeouts are not supported on this stream.

   at System.IO.Stream.get_ReadTimeout()

   at Microsoft.Azure.WebJobs.Host.Blobs.DelegatingStream.get_ReadTimeout() at C:\projects\azure-webjobs-sdk-rqm4t\src\Microsoft.Azure.WebJobs.Host\Blobs\DelegatingStream.cs : 53

   at lambda_method(Closure ,Object )

   at Newtonsoft.Json.Serialization.ExpressionValueProvider.GetValue(Object target) 

   End of inner exception

   at Newtonsoft.Json.Serialization.ExpressionValueProvider.GetValue(Object target)

   at Newtonsoft.Json.Serialization.JsonSerializerInternalWriter.CalculatePropertyValues(JsonWriter writer,Object value,JsonContainerContract contract,JsonProperty member,JsonProperty property,JsonContract& memberContract,Object& memberValue)

   at Newtonsoft.Json.Serialization.JsonSerializerInternalWriter.SerializeObject(JsonWriter writer,Object value,JsonObjectContract contract,JsonProperty member,JsonContainerContract collectionContract,JsonProperty containerProperty)

   at Newtonsoft.Json.Serialization.JsonSerializerInternalWriter.SerializeValue(JsonWriter writer,Object value,JsonContract valueContract,JsonProperty member,JsonContainerContract containerContract,JsonProperty containerProperty)

   at Newtonsoft.Json.Serialization.JsonSerializerInternalWriter.Serialize(JsonWriter jsonWriter,Object value,Type objectType)

   at Newtonsoft.Json.JsonSerializer.SerializeInternal(JsonWriter jsonWriter,Object value,Type objectType)

   at Newtonsoft.Json.JsonSerializer.Serialize(JsonWriter jsonWriter,Object value)

   at DurableTask.Core.Serializing.JsonDataConverter.Serialize(Object val…

That happens both on the local runtime when debugging and on Azure when deployed.


As far as I understand, an object of type Stream cannot be serialized in order to be passed to the orchestrator. It is very possible that I am doing something wrong :) so I hope you can give me some advice.

You can see the primitive example that I am trying to run here


P.S. As soon as I get this working, I promise to share an example and blog a bit about it





Just updated the code: the orchestrator now gets the name of the triggering blob from the orchestration client, and access to that blob's Stream is obtained using a BlobContainer binding and the blob name. Here is the commit:
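For context, here is a minimal sketch of that approach (function names and the binding path are my own placeholders, not the actual code in the commit): with a trigger path like "input/{name}", the blob's name can be bound to a string parameter, and only that JSON-serializable string is passed to the orchestration.

#r "Microsoft.Azure.WebJobs.Extensions.DurableTask"

using System.Threading.Tasks;

// Assumes function.json declares a blobTrigger with "path": "input/{name}",
// so the {name} binding expression supplies just the blob's name here.
public static async Task Run(string name, DurableOrchestrationClient starter, TraceWriter log)
{
    // Pass only the blob name (a plain string) to the orchestrator,
    // never the Stream itself, since inputs must be JSON-serializable.
    string instanceId = await starter.StartNewAsync("CsvOrchestrator", name);
    log.Info($"Started orchestration '{instanceId}' for blob '{name}'.");
}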


I hope that looks better.


Thanks! Yes, only JSON-serializable types can be used as arguments to orchestrator or activity functions. I think that explains the exception you were seeing earlier. Passing the blob name instead of the stream is the right way to go.


One thing I noticed is that you are calling blob storage directly from your orchestrator function. We actually discourage doing any I/O in an orchestrator function, and instead suggest that you do I/O (such as calling into blob storage) only from activity functions. See this documentation for more information on why: (especially the part about code constraints).
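To illustrate that guidance, here is a rough sketch (all function and member names here are hypothetical, and the container binding would be declared in the activity's function.json): the orchestrator only coordinates, while an activity function performs the blob read.

#r "Microsoft.Azure.WebJobs.Extensions.DurableTask"

using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage.Blob;

// Orchestrator: no I/O here, only coordination. It receives the blob name
// and delegates the actual storage read to an activity function.
public static async Task<int> RunOrchestrator(DurableOrchestrationContext context)
{
    string blobName = context.GetInput<string>();
    string csvText = await context.CallActivityAsync<string>("ReadBlob", blobName);
    return csvText.Split('\n').Length; // e.g. a rough CSV row count
}

// Activity ("ReadBlob"): I/O is allowed here. The CloudBlobContainer would
// be supplied by a blob input binding on this activity function.
public static async Task<string> ReadBlob(string blobName, CloudBlobContainer container)
{
    CloudBlockBlob blob = container.GetBlockBlobReference(blobName);
    return await blob.DownloadTextAsync();
}

Keeping the orchestrator free of I/O matters because its code is replayed by the Durable Task framework, and replayed code must be deterministic.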