Microsoft Planner: Considerations for Reporting - Part 4 of 3

Brian Smith
Mar 06, 2019

First published on MSDN on Sep 29, 2017

As I mentioned in Part 3 – there are a few other things I wanted to cover, so here is part 4.  If you didn’t see the other parts – then here are the links – so come back when you’ve read them.

The aim of the series wasn’t to provide a finished answer to every possible reporting scenario, but to expose you to the data model and how you can pull the data – and along the way make you think about what you really need, and how you might use and govern Planner to achieve that goal.  And even, in thinking that through, to decide whether Planner is really the right tool.  If you want reports that show resource capacity, usage and actual work then maybe Project Online is where you need to be?  The recent release of the link capability from Project Online to Planner opens up new opportunities to use the products together (as do other 3rd party offerings).  If Planner is the right place then you need to consider how you ensure your reporting is valid – so some form of templating and filtering to make your reports consistent will certainly be necessary.

Another aim for me was to look at some different technologies along the way – so I touched on Flow, Azure Functions and Cosmos DB – as well as Power BI, which I will come back to later in this posting.  I’m really sold on Flow, and I’ll be doing some other postings soon on different scenarios around Planner and Project Online.  I’ll also be looking closer at Logic Apps in Azure – very similar in capabilities to Flow and PowerApps.  Flow and Azure Functions make a great combination, and I’ll also be looking at C# Functions to see what extra performance they give (as well as potentially reduced cost if they use fewer resources than my PowerShell examples – we will see).  But that is for another day… I initially prepared the topic of this blog as a technical session for Microsoft’s internal training event, Ready, and re-worked some of it for the blog – and if I were starting again I’d probably re-work even more.

I’m not sure Azure Cosmos DB is the right data store for this kind of reporting solution, but I enjoyed learning about its capabilities and would certainly look to this as a back-end for any application development I might do – as it has some great capabilities.  If I were using Cosmos DB I’d also use a different approach and create a document for each plan – including the plan details, tasks, task details and assignments inside the document.  My approach showed my background in relational SQL databases – trying to make collections = tables.

For those keen to learn more, I ‘discovered’ Cosmos DB (under the previous DocumentDB name) while working through the Microsoft Professional Program for Big Data.

You can ‘audit’ the edX courses for free – some great sessions in this program as well as the Data Science track.

I more recently used the Lynda resource Cosmos DB Developer Deep Dive - https://www.lynda.com/Azure-tutorials/Cosmos-DB-Developer-Deep-Dive/612187-2.html.

If you do play around with Cosmos DB then I’d certainly suggest the resources above – and be sure to set the Throughput (RU/s) to 400 for starters so you don’t get any billing surprises.  You can set this when creating the database and also when creating collections.  Also be sure you clean things up when you aren’t using it – either drop the collections (an empty DB does not consume much) or delete the DB.  I ended up using the new Azure Cloud Shell and a couple of scripts to do the creation/deletion of my collections.  The Geo capabilities of Cosmos DB are really easy to set up – you can add read copies of your database in other geographies – and even fail over to make the other region the write region.

This can be very useful if, for example, you were developing a solution in the US that would eventually be rolled out and have most users in Europe – you could start your work on a local write region in the US, then add a Europe read region and fail over (and potentially delete the US read region) once you went into production.  If you are just kicking the tyres, however, you will want to remove additional regions once you have finished playing with them – otherwise you will get billed for the other regions.

Another great source of Azure information is The Azure Podcast, and with Ignite 2017 coming to a close there are also plenty of recorded sessions to review.

I should also point out that you’d want to test that the Graph calls I’ve used do what you need them to do, and that the account you use has all the permissions required and is finding the plans.  In my case they were all my plans, created centrally – check to make sure you aren’t getting throttled or hitting any limits on plan or task ownership.  As I’ve mentioned in the earlier posts – wherever you are pushing the data you may want to trim out fields/properties you aren’t interested in, or potentially add other metadata that adds value for your scenario.

The final part I wanted to touch on in this ‘Part 4’ of the trilogy was directly reporting on Cosmos DB from Power BI – which is in Beta right now.

You can connect to Cosmos DB using the Azure Cosmos DB (Beta) option in the Get Data panel – under Azure:

You will initially get a pop-up mentioning this is a preview connector – and I’ve seen it enough times so I’m checking the box to not show it again:

Next you get to enter the URL of your Cosmos DB, and optionally the Database and Collection.

I’m entering my URL and the Database, then clicking OK – I’ll select the Collection later.  Optionally you could also enter a SQL statement to make your data selection.
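
Behind the dialog this is just a call to the connector’s DocumentDB.Contents function.  Here is a minimal sketch of the kind of M the editor generates – the account URL and database name are placeholders, so substitute your own:

    let
        // Placeholder account URL and database name - use your own values
        Source = DocumentDB.Contents("https://myplannerdata.documents.azure.com:443/", "PlannerReporting")
    in
        Source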

I am then presented with my list of collections – and to start with I’m selecting Plans.  You can see that I just see one column of ‘records’ titled Document.  Clicking Edit allows me to see what the data is and choose what I want to get.

Clicking Edit opens the query editor (if I had selected more than one collection I could have opened the query editor later), and to expand the records I can click the icon at the top right of the Document column.

I can select the ‘columns’ I want – and note the warning message that the list may not be complete.  If, for example, there were fields not present in the first sampled rows, you wouldn’t see them listed.  This isn’t an issue in this case – but when expanding other columns later we will see where this might be an issue.  For now I am only interested in createdBy, owner, title and id.
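
If you peek at the generated M, this expansion is a Table.ExpandRecordColumn step – roughly like the following sketch (using the same placeholder connection details as before, with the Plans collection selected):

    let
        Source = DocumentDB.Contents("https://myplannerdata.documents.azure.com:443/", "PlannerReporting", "Plans"),
        // Pull just the fields I care about out of the Document record
        Expanded = Table.ExpandRecordColumn(Source, "Document",
            {"createdBy", "owner", "title", "id"},
            {"Document.createdBy", "Document.owner", "Document.title", "Document.id"})
    in
        Expanded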

This will then display the data in the grid – and we can see our owner (actually the GUID of the Group that this Plan belongs to) and our title and the Plan ID – and that createdBy is also another record with additional data inside.

If we try to expand we can see that there are two more columns – user and application.  I’m only really interested in user but I’ll leave all checked just to show the behavior.

And we have more levels to go…

This just maps onto the JSON of the Plan – so looking in Cosmos DB Data Explorer and querying for the record that is first in our list – Tech Summit 2017 – using the filter SELECT * FROM c WHERE c.title = 'Tech Summit 2017' I can see that createdBy contains user and application – and each of those has a displayName and id.
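
In the query editor that same nesting just shows up as records within records – conceptually something like this (the values are placeholders, not real data):

    // Illustrative shape only - all values are placeholders
    [
        createdBy = [
            user        = [displayName = "<user display name>", id = "<user guid>"],
            application = [displayName = "<app display name>", id = "<application guid>"]
        ]
    ]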

As I’m only interested in the id of the user I deleted the application column, then expanded the user column but only selected id.

At this stage you will likely want to rename the columns – to at least drop the ‘Document’ prefix – and depending on the data you may need to change the type.  Generally all data from the JSON will be interpreted as text – so you will need to map to different number formats, or possibly dates in some cases.  The usual time I remember this is when Power BI will only count things – and not show me the totals I expect!
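
As it happens, the delete, expand and rename from the last few steps can be rolled together by letting Table.ExpandRecordColumn do the renaming for you – a sketch, again with placeholder connection details (expanding only user quietly discards application at the same time):

    let
        Source = DocumentDB.Contents("https://myplannerdata.documents.azure.com:443/", "PlannerReporting", "Plans"),
        // Passing the same names as the new column names drops the 'Document.' prefix
        Expanded = Table.ExpandRecordColumn(Source, "Document",
            {"createdBy", "owner", "title", "id"},
            {"createdBy", "owner", "title", "id"}),
        // Expanding only 'user' drops application in the same step
        CreatedByUser = Table.ExpandRecordColumn(Expanded, "createdBy", {"user"}, {"user"}),
        // ...and from user I keep just the id, renamed as I go
        CreatedById = Table.ExpandRecordColumn(CreatedByUser, "user", {"id"}, {"createdById"})
    in
        CreatedById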

I’m happy with the Plan feed – and don’t need to change types – so I can just choose Close and Apply.

Going through a similar set of steps for the Tasks collection and choosing the columns I am interested in shows that some of these are also their own records within the JSON – appliedCategories and assignments (I should really have unchecked the ‘use original column name as a prefix’ option…).
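
For what it’s worth, that checkbox maps to the second list argument of Table.ExpandRecordColumn – pass the bare field names as the new column names and no prefix is added.  A sketch for the Tasks collection (the field list is illustrative – it depends on what you kept when pushing the data into Cosmos DB):

    let
        Source = DocumentDB.Contents("https://myplannerdata.documents.azure.com:443/", "PlannerReporting", "Tasks"),
        // Same names in both lists = no 'Document.' prefix on the new columns
        Expanded = Table.ExpandRecordColumn(Source, "Document",
            {"planId", "title", "percentComplete", "appliedCategories", "assignments"},
            {"planId", "title", "percentComplete", "appliedCategories", "assignments"})
    in
        Expanded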

Expanding the appliedCategories column I can see I have the categories – but am missing category4.

This is where ‘Load more’ is important.

Clicking OK expands out the categories – by adding 6 columns which contain TRUE or null – indicating which categories are set for these tasks.  The actual names of the categories can be pulled from PlanDetails – but in this case, as my plans were filtered to all be ones created from my template, I know what my categories should be – and they are the same for all PlanIDs.  In this case I could probably just do some kind of substitution – or that may have been a transform that could have been handled when the plan data was pulled, before pushing onto my data store.  A choice you can make.

Here I’m using Power BI to replace the TRUE values in category1 with Marketing after first changing the column from Boolean to Text.
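
In M that’s a type change followed by a value replacement – a sketch, with Marketing standing in for whatever category1 means in your own plan template:

    let
        Source = DocumentDB.Contents("https://myplannerdata.documents.azure.com:443/", "PlannerReporting", "Tasks"),
        Expanded = Table.ExpandRecordColumn(Source, "Document",
            {"title", "percentComplete", "appliedCategories"},
            {"title", "percentComplete", "appliedCategories"}),
        Categories = Table.ExpandRecordColumn(Expanded, "appliedCategories",
            {"category1", "category2", "category3", "category4", "category5", "category6"},
            {"category1", "category2", "category3", "category4", "category5", "category6"}),
        // Boolean -> Text first, then swap TRUE for the real label
        AsText = Table.TransformColumnTypes(Categories, {{"category1", type text}}),
        Labelled = Table.ReplaceValue(AsText, "TRUE", "Marketing", Replacer.ReplaceValue, {"category1"})
    in
        Labelled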

Moving on to Assignments – this is where things get a little messy – as they follow the same kind of thing as the categories, but potentially there could be a lot more values – and having many columns with headers that are the GUID of the resource, and then having to replace the ‘true’ with names, would quickly get tedious.  Another case where transforming the data earlier makes sense.  And it was for this very reason that I reworked things to pull out a separate Assignments collection!

Some data type changes to ensure my percent complete and counts of checkbox items are integers and my dates are date/time, and I’m good to go!
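
Continuing the Tasks sketch above, the type changes are a single Table.TransformColumnTypes step – the column names here are just examples from my template, and assume you expanded them earlier:

    // Chained onto the 'Labelled' step in the previous sketch
    Typed = Table.TransformColumnTypes(Labelled,
        {{"percentComplete", Int64.Type},
         {"checklistItemCount", Int64.Type},
         {"dueDateTime", type datetime},
         {"createdDateTime", type datetime}})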

Following similar steps I can get my assignments and members in and then link my plans to my tasks and tasks to my assignments and assignments to members.  For members I’m only interested in the id and displayName.

Then with a bit of filtering it is easy to get a list of my assignments across all plans where the task has yet to be started.  It wouldn’t take much effort here to also change my column heading of percentComplete to Progress, and the values of 0 to Not started – to match up better to the UI of Planner.
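
As a sketch of those last touches – again as steps chained inside the Tasks query, and purely illustrative – the relabelling and the not-yet-started filter might look like:

    // Progress needs to be text before 0 can become 'Not started'
    ProgressAsText = Table.TransformColumnTypes(Typed, {{"percentComplete", type text}}),
    Friendly = Table.ReplaceValue(ProgressAsText, "0", "Not started", Replacer.ReplaceValue, {"percentComplete"}),
    Progress = Table.RenameColumns(Friendly, {{"percentComplete", "Progress"}}),
    // And keep just the tasks that haven't been started
    NotStarted = Table.SelectRows(Progress, each [Progress] = "Not started")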
