For the third and final part of this short series on reporting against Planner data I will take some of the ideas from Part 1 and Part 2 and put more process around them. These are still really just ideas to make you think about your use of Planner – not a fully baked solution. I don’t, for example, handle the messy business of keeping everything in sync – I take the approach of loading a snapshot each time. The tools I’m using may not be the best ones for you either, but they hopefully highlight the open nature of the Graph, and also showcase the parts that Flow and many of the Azure workloads can play in an overall solution.
This is one of the outcomes of the blog – just to keep you reading: a Power BI report of the status of Planner tasks, for a specific set of Plans.
For the session at our internal Ready conference that sparked this series I used Python for some of the Graph reading, based on this sample - https://github.com/microsoftgraph/python3-connect-rest-sample - a Flask web app that handled the authentication. I made the necessary changes to the requested permission scopes, made the various Graph calls and manipulated the responses, and finally used the Azure DocumentDB Data Migration tool to push the data into Azure Cosmos DB. It felt a little clunky.
Since then I’ve re-worked things and moved back from Python to PowerShell (mainly due to my competencies – still learning Python – come to that, still learning PowerShell!) and combined Flow and Azure Functions to move the data via REST calls into Azure Cosmos DB.
So to start – I have an Azure Cosmos DB provisioned and a database created called Planner. I decided to store data in several collections matching the data elements in Planner, with the addition of an extra one for Assignments – pulling these out of the Task documents for easier reporting. I could have saved all the different JSON into just one collection, as Cosmos DB is a NoSQL store, but again for reporting and hitting it with Power BI the separate collections made things easier. In my testing I was re-loading the data many times, so I made life easier by using the Cloud Shell and a script to delete and re-create the collections (first time I’d used vi for a while…).
The next part was the Azure Functions – which read all the Planner data using the Graph API and then write it to Cosmos DB. Initially I did this in one function, but even with just 30 or so plans I was getting close to the default function time-out of 5 minutes, so I changed it so that the initial function reads most of the plan-level data and then makes another function call for each plan to get the task data. This could probably be improved by passing over the token information too, avoiding some re-authentication in the second function.
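The split described above – one orchestrating function that fans out a call per plan – can be outlined as a simple fan-out pattern. This is an illustrative Python sketch, not the actual Functions code: `process_plan` and `fan_out` are made-up names, and in the real solution the worker call is an HTTP POST to the second function with the plan ID in the body.

```python
from concurrent.futures import ThreadPoolExecutor

def process_plan(plan_id):
    # Stand-in for the real work: in the actual solution this is an
    # HTTP POST to the second function with {"planId": plan_id}.
    return "processed " + plan_id

def fan_out(plan_ids):
    # One worker call per plan, so no single invocation does all the
    # task reading and approaches the time-out.
    with ThreadPoolExecutor(max_workers=8) as pool:
        futures = [pool.submit(process_plan, p) for p in plan_ids]
    return [f.result() for f in futures]
```

The same shape applies whatever the worker does: the orchestrator only gathers plan IDs and dispatches, so its run time grows slowly with the number of plans.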
You pay for Azure Functions based on the time they run and the resources they consume – plus storage while they are not running. I have yet to hit $0.05 of function costs during my testing.
The first function has some setup for the Cosmos DB REST calls: the authorization key is created from the verb, the resource type, the resource ID, the date and the master key, so a key is needed for each collection. There is also some setup for the Planner Graph calls (including keys/accounts/passwords, as this runs from the function) and the import of some modules for the Graph authentication. These modules need to be loaded into the function, which is done using the right-hand ‘View files’ window.
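For reference, the Cosmos DB master-key signature scheme described above (the same one the fnGetKey function in the scripts at the foot of this post implements in PowerShell) can be sketched in Python. The key below is a dummy value for illustration – a real call would use your account's master key and the RFC 1123 date also sent in the x-ms-date header:

```python
import base64, hashlib, hmac, urllib.parse

def get_auth_key(verb, resource_id, resource_type, date_rfc1123, master_key):
    """Build the 'Authorization' header value for a Cosmos DB REST call."""
    key_bytes = base64.b64decode(master_key)
    # Signed payload: verb, resource type, resource link, date, blank line.
    # Note the resource link keeps its original case.
    payload = (verb.lower() + "\n" + resource_type.lower() + "\n"
               + resource_id + "\n" + date_rfc1123.lower() + "\n" + "\n")
    sig = base64.b64encode(
        hmac.new(key_bytes, payload.encode("utf-8"), hashlib.sha256).digest()
    ).decode()
    # The whole token is URL-encoded before going into the header
    return urllib.parse.quote("type=master&ver=1.0&sig=" + sig, safe="")

# Dummy key, purely for illustration
dummy_key = base64.b64encode(b"not-a-real-key").decode()
auth = get_auth_key("POST", "dbs/Planner/colls/Plans", "docs",
                    "Tue, 01 Aug 2017 12:00:00 GMT", dummy_key)
```

Because the verb and resource link are part of the signed payload, a key generated for a POST to the Plans collection will not work for any other verb or collection – which is why the script below generates one key per collection.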
I’ll add the full scripts at the foot of the blog, so as not to break the flow of this document too much, and just walk through some key parts here. In an earlier version I had used the me/planner/plans Graph call to get the Plans, but there are some challenges with that call in that it only gets the subscribed plans – so in this example I switched to the v1.0/Groups call (as the account I was using had access) and then used the groups/<GroupId>/planner/plans call. This has the additional advantage that it does get the potentially multiple plans per Group that can be created via Microsoft Teams – hence my use of a ForEach when reading the plans from each Group.
One other gotcha I hit when running from the function, compared to running PowerShell on my desktop, was seeing some ‘bad json’ messages. Adding -Compress to ConvertTo-Json appeared to resolve these; I suspect either trailing spaces or ‘\’ characters were the root cause, exposed by a different PowerShell version. I also wrapped the planner loop in a try/catch, as some groups may not have plans.
After looping around the plans, plan details and members, and writing each set of data off to Cosmos DB using a REST call like the one in the following screenshot, I make the call to the second function – shown in the subsequent screenshot. You can also see in the PlanDetails section that this is where I filter the Plans, only pushing the remaining data to Cosmos DB if category1 is set to Marketing. This is a pretty rough way to filter, but it does the job in my case, as all the plans I am interested in were created from a template where category1 was Marketing.
And here is the call to the second function. I pass in the Plan ID, and use Start-Job so this is an asynchronous call and I’m not waiting on a response. This also necessitates the use of $using: so that the asynchronous call has access to the local variables. You will probably want to add some kind of error handling here – or at least keep an eye on function failures with an alert. Notice the -UseBasicParsing too; this is required because the call is made from within the function, where the Internet Explorer engine is not available:
The second function called above is very similar once we get past the initial code that gets the Plan ID from the request.
I proceed with the REST and Planner auth stuff after that (which could perhaps be passed in the request to save some time/money), and then one added complexity was pulling the assignments out into their own collection – so it is easier to look at assignments across plans.
In my case I’ve just executed the first function manually for now, but it would make sense to put it on a schedule – and you could also tie in the deletion/creation of the collections to save on costs between reporting cycles. If I hadn’t wanted to learn more about Cosmos DB it would probably make more sense to push this data into Azure SQL Database, but the very cool multi-geo replication possibilities of Cosmos DB are certainly worth a look. I did start spending a bit more of my Azure allocation than I’d expected after creating collections with high throughput settings – and writing my data to Europe, which then replicated to the US and Brazil. Lesson learned.
Monitoring the functions through the Azure Portal I can see that my current run of the initial function was successful and took just over 2 minutes. I left some of my earlier failures in there too – the debug capabilities in Functions are pretty good!
This was of course calling out to the other function – so monitoring that one I also see success (just showing the last few here), with each run taking around 15 to 20 seconds.
Here is an example of one record – part of a task – and as you can see this is stored in JSON format:
Getting this far, I’m thinking that a Part 4 of my trilogy will be needed to dig deeper into using Power BI directly against Cosmos DB – for now I wanted to concentrate on another Flow. I have a scheduled Flow that runs every day, makes several calls to query documents in my Cosmos DB, and then creates and updates an item in a SharePoint list.
The first Query documents action looks like this – it goes to my Planner Cosmos DB and my Tasks collection and executes the query:
SELECT Count(c.id) from c WHERE c.percentComplete=0
You can test out this kind of query in the Azure portal against the Cosmos DB using the Query Explorer, and my answer is currently 73 tasks with a percentComplete of 0 – which means not started:
The next step of the Flow creates an item in SharePoint – the step looks like the following, where I set the site address and the list name, write a title (which I don’t actually use), set the date to now, and use an expression to write out my query result of 73.
The expression to get the value from my query was the tricky part for me – but finally this worked:
int(body('Query_documents')?['Documents']?[0]?['$1'])
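To see what that expression is unpacking: the Query documents action returns a JSON body containing a Documents array, and because the SELECT uses Count(c.id) without an alias, the aggregate comes back under the generated name $1. A small Python sketch of the same navigation – the sample body here is illustrative, not a captured response:

```python
# Shape of the Cosmos DB query response body (illustrative sample)
sample_body = {"Documents": [{"$1": 73}], "_count": 1}

def extract_count(body):
    # Equivalent of int(body('Query_documents')?['Documents']?[0]?['$1'])
    return int(body["Documents"][0]["$1"])
```

Aliasing the aggregate in the query (e.g. SELECT VALUE Count(c.id), or Count(c.id) AS taskCount) would give a friendlier property name than $1, at the cost of changing the Flow expression to match.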
The remaining steps just repeat the query for 50 (In Progress) and 100 (Completed) and update the list item (using its ID) with the appropriate values.
Looking at my Flow run history I can see that I had a few failures – these were days when my collections didn’t exist. The two recent successes were just after I added my collections while writing this blog (result = 0 for all fields! – so I deleted that row in SharePoint), and the most recent successful run added my correct values.
In SharePoint my list looks like this:
I must admit to cheating here and adding a bit of history manually – so that the Power BI reports would look good. After pointing Power BI at my SharePoint list, removing the columns I wasn’t interested in and ensuring that the remaining data was seen as a date field and three integers, I could finally get to the report I showed at the top of the blog – now with the additional values from today (I’d added a plan and updated a few progress values earlier):
I’ll do an addendum or a Part 4 of the trilogy to look at using the Power BI Cosmos DB direct reporting options – but hopefully this has given you some ideas. I’ve probably skipped over some important point or other – let me know if anything needs better explanation.
The PowerShell scripts used in the two functions follow – with some details edited where you will need to add your own accounts, keys, URLs etc. These are supplied as-is – no warranties. Beware of straight quotes (") being turned curly, and of new lines where there shouldn’t be new lines.
The Initial Function
#Setup stuff for the DocDB REST calls
[System.Reflection.Assembly]::LoadWithPartialName("System.Web") | out-null
$global:resourceType="docs"
$verb="POST"
$date = Get-Date
$utcDate = $date.ToUniversalTime()
$global:xDate = $utcDate.ToString('r',[System.Globalization.CultureInfo]::InvariantCulture)
$global:masterKey = "<Your master key from CosmosDB goes here>"
# generate signature for REST call
function fnGetKey ($pVerb, $pResourceId, $pResourceType, $date, $masterKey)
{
$keyBytes = [System.Convert]::FromBase64String($masterKey)
$sigCleartext = @($pVerb.ToLower() + "`n" + $pResourceType.ToLower() + "`n" + $pResourceId + "`n" + $date.ToString().ToLower() + "`n" + "" + "`n")
$bytesSigClear =[Text.Encoding]::UTF8.GetBytes($sigCleartext)
$hmacsha = new-object -TypeName System.Security.Cryptography.HMACSHA256 -ArgumentList (,$keyBytes)
$hash = $hmacsha.ComputeHash($bytesSigClear)
$signature = [System.Convert]::ToBase64String($hash)
$key = [System.Web.HttpUtility]::UrlEncode($('type=master&ver=1.0&sig=' + $signature)) # needs System.Web loaded!
return $key
}
#Setup stuff for the Planner Calls
$clientId = '<A registered client ID in Azure with the right access levels>'
$aadtenant = "<yourtenant>.onmicrosoft.com"
$username = "<yourname>@<yourtenant>.onmicrosoft.com"
$password = "<your password>" | ConvertTo-SecureString -AsPlainText -Force
$Credential = New-Object -typename System.Management.Automation.PSCredential -argumentlist $username, $password
$Modulebase = (get-Module MicrosoftGraphAPI).ModuleBase
Import-Module "D:\home\site\wwwroot\HttpTriggerPlannerToDocDB\Microsoft.IdentityModel.Clients.ActiveDirectory.dll"
Import-Module "D:\home\site\wwwroot\HttpTriggerPlannerToDocDB\Microsoft.IdentityModel.Clients.ActiveDirectory.WindowsForms.dll"
$resourceAppIdURI = "https://graph.microsoft.com"
$authority = "https://login.windows.net/$aadTenant"
$authContext = New-Object "Microsoft.IdentityModel.Clients.ActiveDirectory.AuthenticationContext" -ArgumentList $authority
$uc = new-object Microsoft.IdentityModel.Clients.ActiveDirectory.UserCredential -ArgumentList $Credential.Username,$Credential.Password
$token = $authContext.AcquireToken($resourceAppIdURI, $clientId,$uc)
#Get DocDB REST authkeys
$global:resourceId= "dbs/Planner/colls/Plans"
$planAuthKey = fnGetKey $verb $resourceId $resourceType $xdate $masterKey
$global:resourceId= "dbs/Planner/colls/PlanDetails"
$planDetailsAuthKey = fnGetKey $verb $resourceId $resourceType $xdate $masterKey
$global:resourceId= "dbs/Planner/colls/Members"
$planMembersAuthKey = fnGetKey $verb $resourceId $resourceType $xdate $masterKey
$global:resourceId= "dbs/Planner/colls/Buckets"
$planBucketsAuthKey = fnGetKey $verb $resourceId $resourceType $xdate $masterKey
$global:resourceId= "dbs/Planner/colls/Tasks"
$planTasksAuthKey = fnGetKey $verb $resourceId $resourceType $xdate $masterKey
$global:resourceId= "dbs/Planner/colls/TaskDetails"
$planTaskDetailsAuthKey = fnGetKey $verb $resourceId $resourceType $xdate $masterKey
$global:resourceId= "dbs/Planner/colls/Assignments"
$planAssignmentsAuthKey = fnGetKey $verb $resourceId $resourceType $xdate $masterKey
#################################################
#Get Plans (V2 - from Groups)
#################################################
$headers = @{}
$headers.Add('Authorization','Bearer ' + $Token.AccessToken)
$headers.Add('Content-Type', "application/json")
#$plans = Invoke-WebRequest -Uri "https://graph.microsoft.com/v1.0/me/planner/plans" -Method Get -Headers $headers -UseBasicParsing
#$plansContent = $plans.Content | ConvertFrom-Json
#$planValue = $plansContent.Value
#First get Groups
$groups = Invoke-WebRequest -Uri "https://graph.microsoft.com/v1.0/Groups" -Method Get -Headers $headers -UseBasicParsing
$groupsContent = $groups.Content | ConvertFrom-Json
$groupValue = $groupsContent.Value
ForEach($group in $groupValue){
try {
$uri = "https://graph.microsoft.com/v1.0/groups/" + $group.id + "/planner/plans"
$plans = Invoke-WebRequest -Uri $uri -Method Get -Headers $headers -UseBasicParsing
$plansContent = $plans.Content | ConvertFrom-Json
$planValue = $plansContent.Value
ForEach($plan in $planValue){
$docdbHeader=@{"Authorization" = "$planAuthKey"; `
"x-ms-version" = "2015-12-16"; `
"x-ms-date" = "$xdate"; `
"x-ms-indexing-directive" = "Include"; `
"x-ms-documentdb-is-upsert" = "True"; `
"Content-Length" = "0"
}
$jsonBody = $plan | ConvertTo-Json -Compress
$docdbUri = "https://<YourCosmosDBName>.documents.azure.com/dbs/Planner/colls/Plans/docs"
Invoke-RestMethod -Uri $docdbUri -Headers $docdbHeader -Method $verb -ContentType "application/json" -Body $jsonBody
#################################################
#Get Plan Details
#################################################
$groupId = $plan.owner
$uri = "https://graph.microsoft.com/v1.0/planner/plans/" + $plan.id + "/details"
$planDetails = Invoke-WebRequest -Uri $uri -Method Get -Headers $headers -UseBasicParsing
$planDetailsContent = $planDetails.Content | ConvertFrom-Json
# Only continue with this plan if category1 matches our target plan type.
if($planDetailsContent.categoryDescriptions.category1 -eq "Marketing"){
$docdbHeader=@{"Authorization" = "$planDetailsAuthKey"; `
"x-ms-version" = "2015-12-16"; `
"x-ms-date" = "$xdate"; `
"x-ms-indexing-directive" = "Include"; `
"x-ms-documentdb-is-upsert" = "True"; `
"Content-Length" = "0"
}
$jsonBody = $planDetailsContent | ConvertTo-Json -Compress
$docdbUri = "https://<yourCosmosDBName>.documents.azure.com/dbs/Planner/colls/PlanDetails/docs"
Invoke-RestMethod -Uri $docdbUri -Headers $docdbHeader -Method $verb -ContentType "application/json" -Body $jsonBody
#GroupMember
$uri = "https://graph.microsoft.com/v1.0/groups/" + $groupId + "/members"
$members = Invoke-WebRequest -Uri $uri -Method Get -Headers $headers -UseBasicParsing
$membersContent = $members.Content | ConvertFrom-Json
$membersContent | Add-Member id $plan.id
$membersValue = $membersContent.value
$docdbHeader=@{"Authorization" = "$planMembersAuthKey"; `
"x-ms-version" = "2015-12-16"; `
"x-ms-date" = "$xdate"; `
"x-ms-indexing-directive" = "Include";
"x-ms-documentdb-is-upsert" = "True"; `
"Content-Length" = "0"
}
ForEach($member in $membersValue){
$jsonBody = $member | ConvertTo-Json -Compress
$docdbUri = "https://<yourCosmosDBName>.documents.azure.com/dbs/Planner/colls/Members/docs"
Invoke-RestMethod -Uri $docdbUri -Headers $docdbHeader -Method $verb -ContentType "application/json" -Body $jsonBody
}
#Buckets
$uri = "https://graph.microsoft.com/v1.0/planner/plans/" + $plan.id + "/buckets"
$buckets = Invoke-WebRequest -Uri $uri -Method Get -Headers $headers -UseBasicParsing
$bucketsContent = $buckets.Content | ConvertFrom-Json
$bucketsContent | Add-Member id $plan.id
$bucketsValue = $bucketsContent.value
$docdbHeader=@{"Authorization" = "$planBucketsAuthKey"; `
"x-ms-version" = "2015-12-16"; `
"x-ms-date" = "$xdate"; `
"x-ms-indexing-directive" = "Include";
"x-ms-documentdb-is-upsert" = "True"; `
"Content-Length" = "0"
}
ForEach($bucket in $bucketsValue){
$jsonBody = $bucket | ConvertTo-Json -Compress
$docdbUri = "https://<yourCosmosDBName>.documents.azure.com/dbs/Planner/colls/Buckets/docs"
Invoke-RestMethod -Uri $docdbUri -Headers $docdbHeader -Method $verb -ContentType "application/json" -Body $jsonBody
}
#Tasks - call out to separate function
$planId = $plan.id
$taskBody=@{"planId" = "$planId"}
$body=$taskBody | ConvertTo-Json -Compress
$taskHeaders = @{}
$taskHeaders.Add('Content-Type', "application/json")
$uri = "https://<yourfunctionendpoint>.azurewebsites.net/api/<YourfunctionName>?code=<from your function Url>"
Start-Job {Invoke-WebRequest -Uri $using:uri -Method Post -Headers $using:taskHeaders -Body $using:body -UseBasicParsing}
}
}
}
catch [Net.WebException] {} # some groups have no plans - just move on
}
The 2nd Function – called from the last part of the one above:
# POST method: $req
$requestBody = Get-Content $req -Raw | ConvertFrom-Json
$planId = $requestBody.planId
# GET method: each querystring parameter is its own variable
if ($req_query_name)
{
$planId = $req_query_name
}
#Setup stuff for the DocDB REST calls
[System.Reflection.Assembly]::LoadWithPartialName("System.Web") | out-null
$global:resourceType="docs"
$verb="POST"
$date = Get-Date
$utcDate = $date.ToUniversalTime()
$global:xDate = $utcDate.ToString('r',[System.Globalization.CultureInfo]::InvariantCulture)
$global:masterKey = "<YourCosmosDB master key>"
# generate signature for REST call
function fnGetKey ($pVerb, $pResourceId, $pResourceType, $date, $masterKey)
{
$keyBytes = [System.Convert]::FromBase64String($masterKey)
$sigCleartext = @($pVerb.ToLower() + "`n" + $pResourceType.ToLower() + "`n" + $pResourceId + "`n" + $date.ToString().ToLower() + "`n" + "" + "`n")
$bytesSigClear =[Text.Encoding]::UTF8.GetBytes($sigCleartext)
$hmacsha = new-object -TypeName System.Security.Cryptography.HMACSHA256 -ArgumentList (,$keyBytes)
$hash = $hmacsha.ComputeHash($bytesSigClear)
$signature = [System.Convert]::ToBase64String($hash)
$key = [System.Web.HttpUtility]::UrlEncode($('type=master&ver=1.0&sig=' + $signature)) # needs System.Web loaded!
return $key
}
#Setup stuff for the Planner Calls
$clientId = '<A registered client ID in Azure with the right access levels>'
$aadtenant = "<yourtenant>.onmicrosoft.com"
$username = "<yourname>@<yourtenant>.onmicrosoft.com"
$password = "<your password>" | ConvertTo-SecureString -AsPlainText -Force
$Credential = New-Object -typename System.Management.Automation.PSCredential -argumentlist $username, $password
$Modulebase = (get-Module MicrosoftGraphAPI).ModuleBase
Import-Module "D:\home\site\wwwroot\HttpTriggerPlannerToDocDB\Microsoft.IdentityModel.Clients.ActiveDirectory.dll"
Import-Module "D:\home\site\wwwroot\HttpTriggerPlannerToDocDB\Microsoft.IdentityModel.Clients.ActiveDirectory.WindowsForms.dll"
$resourceAppIdURI = "https://graph.microsoft.com"
$authority = "https://login.windows.net/$aadTenant"
$authContext = New-Object "Microsoft.IdentityModel.Clients.ActiveDirectory.AuthenticationContext" -ArgumentList $authority
$uc = new-object Microsoft.IdentityModel.Clients.ActiveDirectory.UserCredential -ArgumentList $Credential.Username,$Credential.Password
$token = $authContext.AcquireToken($resourceAppIdURI, $clientId,$uc)
#Get DocDB REST authkeys
$global:resourceId= "dbs/Planner/colls/Tasks"
$planTasksAuthKey = fnGetKey $verb $resourceId $resourceType $xdate $masterKey
$global:resourceId= "dbs/Planner/colls/TaskDetails"
$planTaskDetailsAuthKey = fnGetKey $verb $resourceId $resourceType $xdate $masterKey
$global:resourceId= "dbs/Planner/colls/Assignments"
$planAssignmentsAuthKey = fnGetKey $verb $resourceId $resourceType $xdate $masterKey
#Get Tasks stuff for planId
$headers = @{}
$headers.Add('Authorization','Bearer ' + $Token.AccessToken)
$headers.Add('Content-Type', "application/json")
$uri = "https://graph.microsoft.com/v1.0/planner/plans/" + $planId + "/tasks"
$tasks = Invoke-WebRequest -Uri $uri -Method Get -Headers $headers -UseBasicParsing
$tasksContent = $tasks.Content | ConvertFrom-Json
$tasksValue = $tasksContent.value
$docdbTaskHeader=@{"Authorization" = "$planTasksAuthKey"; `
"x-ms-version" = "2015-12-16"; `
"x-ms-date" = "$xdate"; `
"x-ms-indexing-directive" = "Include";
"x-ms-documentdb-is-upsert" = "True"; `
"Content-Length" = "0"
}
ForEach($task in $tasksValue){
$assignmentNames=@{} # the [array] cast below makes this empty hashtable the first element, which surfaces as the 'Unassigned' row
$jsonBody = $task | ConvertTo-Json -Compress
$docdbUri = "https://<YourCosmosDBName>.documents.azure.com/dbs/Planner/colls/Tasks/docs"
Invoke-RestMethod -Uri $docdbUri -Headers $docdbTaskHeader -Method $verb -ContentType "application/json" -Body $jsonBody
ForEach($assignment in ($task.assignments | Get-Member -MemberType NoteProperty)){
[array]$assignmentNames +=$assignment
}
ForEach($assignmentName in $assignmentNames){
$assignee=@{}
If($assignmentName.Name){
$assignee | Add-Member name $assignmentName.Name.ToString()
}
Else{
$assignee | Add-Member name "Unassigned"
}
$assignee | Add-Member planId $planId
$assignee | Add-Member taskId $task.id
$uniqueAssnId = $assignee.planId + "_" + $assignee.taskId + "_" + $assignee.name
$assignee | Add-Member id $uniqueAssnId
$docdbAssnHeader=@{"Authorization" = "$planAssignmentsAuthKey"; `
"x-ms-version" = "2015-12-16"; `
"x-ms-date" = "$xdate"; `
"x-ms-indexing-directive" = "Include";
"x-ms-documentdb-is-upsert" = "True"; `
"Content-Length" = "0"
}
$jsonBody = $assignee | ConvertTo-Json -Compress
$docdbUri = "https://<YourCosmosDBName>.documents.azure.com/dbs/Planner/colls/Assignments/docs"
Invoke-RestMethod -Uri $docdbUri -Headers $docdbAssnHeader -Method $verb -ContentType "application/json" -Body $jsonBody
}
}
#Task Details
ForEach($task in $tasksValue){
$uri = "https://graph.microsoft.com/v1.0/planner/tasks/" + $task.id + "/details"
$taskDetails = Invoke-WebRequest -Uri $uri -Method Get -Headers $headers -UseBasicParsing
$taskDetailsContent = $taskDetails.Content | ConvertFrom-Json
$docdbTaskDetailsHeader=@{"Authorization" = "$planTaskDetailsAuthKey"; `
"x-ms-version" = "2015-12-16"; `
"x-ms-date" = "$xdate"; `
"x-ms-indexing-directive" = "Include"; `
"x-ms-documentdb-is-upsert" = "True"; `
"Content-Length" = "0"
}
$jsonBody = $taskDetailsContent | ConvertTo-Json -Compress
$docdbUri = "https://<YourCosmosDBName>.documents.azure.com/dbs/Planner/colls/TaskDetails/docs"
Invoke-RestMethod -Uri $docdbUri -Headers $docdbTaskDetailsHeader -Method $verb -ContentType "application/json" -Body $jsonBody
}