Forum Discussion

RobCaron
Microsoft
Dec 02, 2017

Today on Azure Friday: Durable Functions in Azure Functions

Chris Gillum joins Scott Hanselman to discuss a new extension of Azure Functions known as Durable Functions. Durable Functions is a programming model for authoring stateful, reliable, and serverless function orchestrations using C# and async/await.
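Concretely, an orchestrator in this model is just an async method that the Durable Task framework checkpoints and replays. A minimal sketch (the function and activity names here are illustrative, not from the episode):

```csharp
// run.csx for a hypothetical orchestrator function.
// Each awaited CallActivityAsync is checkpointed; if the host restarts,
// the orchestrator is replayed and resumes after the last completed activity.
#r "Microsoft.Azure.WebJobs.Extensions.DurableTask"

using System.Threading.Tasks;

public static async Task<string> Run(DurableOrchestrationContext context)
{
    string input = context.GetInput<string>();

    // Ordinary async/await, but durable: "SayHello" is an activity function.
    string greeting = await context.CallActivityAsync<string>("SayHello", input);
    return greeting;
}
```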

For more information, see: 

5 Replies

  • Ivan Buzyka
    Copper Contributor

    Just a few things need to be done before durable functions can be used (or at least created in the Portal): https://markheath.net/post/create-durable-functions-azure-portal

     

    Moreover, the documentation is currently pretty poor. For example, I cannot understand how to upgrade my existing function, which is triggered by a BlobTrigger (as soon as a new file is uploaded to a specific container) and basically does some computation on the contents of the uploaded CSV file. That is a good candidate for rewriting as a durable fan-out/fan-in function. So the challenge for me is to understand how to trigger the orchestrator from a BlobTrigger.

     

    Still trying to find a way; I'd appreciate any information :)
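For reference, the fan-out/fan-in pattern mentioned above might look roughly like this as an orchestrator. This is a hedged sketch: the activity names ("ParseCsv", "ProcessRow", "Aggregate") are hypothetical placeholders, not part of any sample:

```csharp
// run.csx for a hypothetical fan-out/fan-in orchestrator over CSV content.
#r "Microsoft.Azure.WebJobs.Extensions.DurableTask"

using System.Collections.Generic;
using System.Threading.Tasks;

public static async Task<int> Run(DurableOrchestrationContext context)
{
    string csvContent = context.GetInput<string>();

    // Split the CSV into rows in an activity, so the work is checkpointed.
    string[] rows = await context.CallActivityAsync<string[]>("ParseCsv", csvContent);

    // Fan out: schedule one activity per row; they run in parallel.
    var tasks = new List<Task<int>>();
    foreach (string row in rows)
        tasks.Add(context.CallActivityAsync<int>("ProcessRow", row));

    // Fan in: wait for all rows, then aggregate the partial results.
    int[] results = await Task.WhenAll(tasks);
    return await context.CallActivityAsync<int>("Aggregate", results);
}
```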

    • Chris Gillum
      Microsoft

      You can use the samples in the GitHub repository as a reference. For example, there are two sample functions that show how you can trigger new orchestration instances:

       

      HttpTrigger: https://github.com/Azure/azure-functions-durable-extension/blob/master/samples/csx/HttpStart/run.csx

      ManualTrigger: https://github.com/Azure/azure-functions-durable-extension/blob/master/samples/csx/ManualStart/run.csx

       

      From these samples, you should be able to figure out how to use other trigger types. Here is an example of how you might start an orchestration from a blob trigger:

       

      run.csx

       

      #r "Microsoft.Azure.WebJobs.Extensions.DurableTask"

      using System;
      using System.Threading.Tasks;

      public static async Task Run(string blobData, DurableOrchestrationClient starter, TraceWriter log)
      {
          log.Info("Starting orchestration.");
          string instanceId = await starter.StartNewAsync("MyOrchestrator", blobData);
          log.Info($"Started orchestration with ID = '{instanceId}'.");
      }

      function.json

      {
        "bindings": [
          {
            "type": "blobTrigger",
            "direction": "in",
            "name": "blobData",
            "path": "samples-workitems/{name}"
          },
          {
            "name": "starter",
            "type": "orchestrationClient",
            "direction": "in"
          }
        ],
        "disabled": false
      }
      • Ivan Buzyka
        Copper Contributor

        Hi Chris,

         

        First of all, thank you for your reply. I am very excited to hear directly from you :)

        I managed to get BlobTrigger working.

         

        However, I got another exception when the trigger fired:

        Exception while executing function: DurableEveDataAggregationEntryPoint

        Microsoft.Azure.WebJobs.Host.FunctionInvocationException : Exception while executing function: DurableEveDataAggregationEntryPoint ---> Newtonsoft.Json.JsonSerializationException : Error getting value from 'ReadTimeout' on 'Microsoft.Azure.WebJobs.Host.Blobs.WatchableReadStream'. ---> System.InvalidOperationException : Timeouts are not supported on this stream.
           at System.IO.Stream.get_ReadTimeout()
           at Microsoft.Azure.WebJobs.Host.Blobs.DelegatingStream.get_ReadTimeout() at C:\projects\azure-webjobs-sdk-rqm4t\src\Microsoft.Azure.WebJobs.Host\Blobs\DelegatingStream.cs : 53
           at lambda_method(Closure ,Object )
           at Newtonsoft.Json.Serialization.ExpressionValueProvider.GetValue(Object target)
           End of inner exception
           at Newtonsoft.Json.Serialization.ExpressionValueProvider.GetValue(Object target)
           at Newtonsoft.Json.Serialization.JsonSerializerInternalWriter.CalculatePropertyValues(JsonWriter writer,Object value,JsonContainerContract contract,JsonProperty member,JsonProperty property,JsonContract& memberContract,Object& memberValue)
           at Newtonsoft.Json.Serialization.JsonSerializerInternalWriter.SerializeObject(JsonWriter writer,Object value,JsonObjectContract contract,JsonProperty member,JsonContainerContract collectionContract,JsonProperty containerProperty)
           at Newtonsoft.Json.Serialization.JsonSerializerInternalWriter.SerializeValue(JsonWriter writer,Object value,JsonContract valueContract,JsonProperty member,JsonContainerContract containerContract,JsonProperty containerProperty)
           at Newtonsoft.Json.Serialization.JsonSerializerInternalWriter.Serialize(JsonWriter jsonWriter,Object value,Type objectType)
           at Newtonsoft.Json.JsonSerializer.SerializeInternal(JsonWriter jsonWriter,Object value,Type objectType)
           at Newtonsoft.Json.JsonSerializer.Serialize(JsonWriter jsonWriter,Object value)
           at DurableTask.Core.Serializing.JsonDataConverter.Serialize(Object val…

        That happens both on the local runtime when debugging and on Azure when deployed.

         

        As far as I understand, an object of type Stream cannot be serialized in order to be passed to the orchestrator. It is very possible that I am doing something wrong :) so I hope you can give me some advice.

        See the primitive example that I am trying to run here: https://github.com/ivanbuzyka/Azure.DurableFunctions.BlobTriggerExperiment/blob/master/DurableEveDataAggregator.cs
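One possible workaround, assuming the stack trace means `StartNewAsync` is trying to JSON-serialize the `Stream` blob binding: bind the blob as a `string` instead, so only the file contents (not the stream object) are handed to the orchestrator. This is a sketch for a precompiled function; the function name, container name, and orchestrator name are illustrative:

```csharp
// Hypothetical blob-triggered starter function. Binding the blob as a
// string (rather than Stream) makes the orchestration input serializable.
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;

public static class BlobTriggeredStarter
{
    [FunctionName("BlobTriggeredStarter")]
    public static async Task Run(
        [BlobTrigger("input-container/{name}")] string blobContents, // string, not Stream
        [OrchestrationClient] DurableOrchestrationClient starter,
        TraceWriter log)
    {
        // blobContents is plain text, so it JSON-serializes cleanly.
        string instanceId = await starter.StartNewAsync("MyOrchestrator", blobContents);
        log.Info($"Started orchestration with ID = '{instanceId}'.");
    }
}
```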

         

        P.S. As soon as I get this working, I promise to share an example and blog a bit about it.
