Global Admin Pro Tip: Learn how to build video analytics reporting using Office 365 audit logs
Published Mar 12 2019 08:15 PM 46K Views
Microsoft

Today in Microsoft Stream, you can see analytics that show you the popularity of a video based on view, comment, and like counts. If you are a global administrator for Office 365, now you can set up a workflow to pull deeper analytics.


Today I’d like to share a way to access video engagement analytics by leveraging Office 365 Audit Logs, along with SharePoint and Microsoft Flow, to deliver reports in Power BI. I will note that I am not a data expert, Power BI pro, or developer – but if you are, you can likely take this much further!


In this training we will cover the following:

 

  1. The data – Unified Audit Logs
  2. Setting up the SharePoint list as the data source
  3. Getting the data
  4. Pulling the data into Power BI
  5. Building a basic report
  6. Refreshing the data

Below are some basic examples of the insights you can derive from the data you have access to.

VideoReportPage.jpg

ManagementReportPage.jpg

LocationReportPage.jpg

The data - Unified Audit Logs

In the Security & Compliance Center of Office 365 (https://protection.office.com) we have a unified logging experience. This service tracks many actions that occur across Office 365. For Microsoft Stream this includes the following:

 

Microsoft Stream video activities (Individual video actions)

Created video
Edited video
Deleted video
Uploaded video
Downloaded video
Edited video permission
Viewed video
Shared video
Liked video
Unliked video
Commented on video
Deleted video comment
Uploaded video text track
Deleted video text track
Uploaded video thumbnail
Deleted video thumbnail
Replaced video permissions and channel links
Marked video public
Marked video private


Microsoft Stream group channel activities (Video organization actions)

Created group
Edited group
Deleted group
Edited group memberships
Created channel
Edited channel
Deleted a channel
Replaced channel thumbnails


Microsoft Stream general activities (Tenant specific admin actions)

Edited user settings
Edited tenant settings
Edited global role members
Deleted User
Deleted User’s Data Report
Edited User
Exported User’s Data Report
Downloaded User’s Data Report

 

Setting up SharePoint

We are going to want to dynamically update our dataset and ensure we have a consistent data connection for use in the Power BI report. To accomplish this, we will store the audit data in a SharePoint list. You could certainly send this to SQL as well, but using SharePoint makes the data more accessible to all. In SharePoint we will create the list with columns that match what we get when we parse the audit logs. We will also add some additional columns to hold friendly names for some of the data.
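If you would rather provision the list columns programmatically than click through the SharePoint UI, the field definitions in the table below map to simple REST payloads. This is a rough sketch of my own (not from the original walkthrough): the list name, endpoint path, and authentication are assumptions you would adapt to your tenant.

```python
# Sketch: build SharePoint REST payloads for the audit-log list columns.
# POST each payload to {site}/_api/web/lists/getbytitle('Audit Data')/fields
# (endpoint path, list title, and auth are assumptions).

TEXT, NOTE, DATETIME = 2, 3, 4  # SP.FieldType enum values (Text, Note, DateTime)

COLUMNS = [
    ("CreationDate", DATETIME),
    ("UserIds", TEXT),
    ("AuditDataResourceUrl", TEXT),
    ("AuditDataResourceTitle", TEXT),
    ("AuditDataOperationDetails", NOTE),
    ("AuditDataOperation", TEXT),
    # ...the remaining AuditData* columns are all single-line text fields
]

def field_payload(name, kind):
    """Build one SP.Field creation payload."""
    return {
        "__metadata": {"type": "SP.Field"},
        "Title": name,
        "FieldTypeKind": kind,
    }

payloads = [field_payload(name, kind) for name, kind in COLUMNS]
```

The calculated columns (Year, Month, Date) and the Person field are easier to add through the UI, since their formulas and lookup settings use separate payload shapes.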

 

SharePoint List Fields

Field Name

Field Type

Notes

Operation (Title)

Single Line of Text

Flow will add the friendly name derived from the AuditDataOperation field. (e.g. StreamInvokeVideoLike becomes Liked video)

CreationDate

Date and Time

Date the Action was performed

UserIds

Single Line of Text

Email of the user that took the action

AuditDataResourceUrl

Single Line of Text

Link to the item

AuditDataResourceTitle

Single Line of Text

Name of the item (Video, Channel, Group)

AuditDataOperationDetails

Multiple Lines of Text (Plain text)

Contains additional details that we don’t parse with a second pass. Some of the data is duplicative. The one field in here that may be of use is Description, but this data set varies depending on action so it could add additional complexities to the flow.

AuditDataClientApplicationId

Single Line of Text

GUID of Application used to perform the action

AuditDataObjectId

Single Line of Text

GUID of the item

AuditDataUserId

Single Line of Text

Email of the user that took the action (duplicate)

AuditDataClientIP

Single Line of Text

IP address from which the user took the action

AuditDataId

Single Line of Text

 

AuditDataRecordType

Single Line of Text

Microsoft Stream service

AuditDataCreationTime

Single Line of Text

Timestamp of when the action was taken, down to the second.

AuditDataOperation

Single Line of Text

The internal action name

AuditDataOrganizationId

Single Line of Text

GUID that defines your tenant. Not important for single organization reports

AuditDataUserType

Single Line of Text

 

AuditDataUserKey

Single Line of Text

 

AuditDataWorkload

Single Line of Text

 

AuditDataResultStatus

Single Line of Text

 

AuditDataVersion

Single Line of Text

 

Year

Calculated

=TEXT(CreationDate,"YYYY")

Month

Calculated

=TEXT(CreationDate,"MMMM")

Date

Calculated

This is for the date only, no time - =TEXT(CreationDate,"MM/DD/YYYY")

User

Person or Group

Populated with AuditDataUserId

Location

Single Line of Text

Optional (uses a location API based on ClientIP)

Application

Single Line of Text

Flow automatically converts the Microsoft internal GUID to the matching application. (cf53fce8-def6-4aeb-8d30-b158e7b1cf83 = Stream Portal)

 

Getting the data

The audit logs are only available to specific members of an organization, such as Global Administrators or other personnel who have been assigned access. In most organizations, if you want this data you will need to request an export. If you have access to the data, you can follow the manual steps here.

 

There are various APIs and PowerShell commands available within the Microsoft 365 platform. For this solution we leveraged Microsoft Flow and used the Search-UnifiedAuditLog PowerShell cmdlet. You may wish to change to the Office 365 Management API, but this easily fit the needs for this project.


Never created a Microsoft Flow? Check out the getting started documentation.


Here is what was done to create a nightly pull of the data to drop into the SharePoint list created above. You can download the template to import into your Microsoft Flow environment from here, or you can build it yourself by following the steps below. As mentioned above, you will need the Global Admin role. If you do not have it, you will need to work with a Global Admin on how to implement this; they will likely need to build the Flow for you. Make sure to rename each action as you go, since some of the expressions used will reference the action names; otherwise you will need to refactor the expressions.
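Before walking through the Flow actions, it can help to see the date window and query string the Flow ends up building. This Python sketch mirrors steps 4–6 below; the PSWS endpoint URL (which also appears in the comments on this post) and the exact parameter quoting are assumptions, and the real request additionally needs basic-auth headers.

```python
from datetime import datetime, timedelta, timezone

# Sketch of the StartDate/EndDate variables and the HTTP query the Flow
# builds. Endpoint and parameter syntax are assumptions; auth is omitted.
ENDPOINT = "https://outlook.office365.com/psws/service.svc/UnifiedAuditLog"

def build_audit_query(now=None):
    now = now or datetime.now(timezone.utc)
    start = now.replace(hour=0, minute=0, second=0, microsecond=0)  # startOfDay(utcNow())
    end = start + timedelta(days=1)                                 # start of the next day
    fmt = "%m/%d/%Y"
    return (
        f"{ENDPOINT}?StartDate='{start.strftime(fmt)}'"
        f"&EndDate='{end.strftime(fmt)}'"
        f"&RecordType=MicrosoftStream&ResultSize=5000"
    )
```

Note how RecordType limits results to Stream events and ResultSize raises the default page size of 100 to the 5,000-item maximum, matching what the HTTP action in step 6 does.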

 

  1. In this scenario we have set the Recurrence trigger to run on a nightly basis at 11:59 PM. You can change this to fit your requirements. Just know that you will need to update the StartDate and EndDate variables to match.
    FlowStep1.png
  2. Use the Initialize Parameter action to store a global admin account or specific service account to the value field. (Reminder that the account needs access to the audit logs). This account is used in multiple locations in this solution.
    FlowStep2.png
  3. Add another Initialize Parameter action to store the password for the account above in the value field
    FlowStep3.png

     

  4. Add another Initialize Parameter action to store the start date for the query using startOfDay(utcNow()) as the value. (Keep in mind this will need to be adjusted if you change the recurrence)

    Tip: If you want to capture all of the data you may have in the audit logs before changing this to a nightly feed consider doing a test run using this expression, addDays(startOfDay(utcNow()), -90)

    FlowStep4.png

     

  5. Add another Initialize Parameter action to store the end date for the query using startOfDay(addDays(utcNow(),1,'MM/dd/yyyy')) as the value. (Keep in mind this will need to be adjusted if you change the recurrence)
    FlowStep5.png
  6. Add the HTTP Request action to build the query against the PowerShell web service. Make sure to select Get for the Method. The URI structure includes our StartDate and EndDate variables. We also include RecordType=MicrosoftStream to make sure we only get the audit data for Stream. At the end of the string we add the ResultSize parameter to go to the maximum allowed of 5,000 items; otherwise this would default to 100. (You can call back as many as 50k items, but it is not recommended; you would need to introduce paging to the query.) Make sure to set the Headers as shown in the image below. Once you click on advanced options you will see the Username and Password fields; simply add those variables to these values.
    FlowStep6.png
  7. Add the Parse JSON action. This is where we will extract the data from the logs. Add the Body from the HTTP action
    FlowStep7.pngAdd the Schema shown below
    {
        "type": "object",
        "properties": {
            "odata.metadata": {
                "type": "string"
            },
            "value": {
                "type": "array",
                "items": {
                    "type": "object",
                    "properties": {
                        "Identity": {
                            "type": "string"
                        },
                        "AuditData": {
                            "type": "string"
                        },
                        "CreationDate": {
                            "type": "string"
                        },
                        "IsValid": {
                            "type": "boolean"
                        },
                        "ObjectState": {
                            "type": "string"
                        },
                        "Operations": {
                            "type": "string"
                        },
                        "RecordType": {
                            "type": "string"
                        },
                        "ResultCount": {
                            "type": "number"
                        },
                        "ResultIndex": {
                            "type": "number"
                        },
                        "UserIds": {
                            "type": "string"
                        },
                        "ObjectIds": {},
                        "IPAddresses": {}
                    },
                    "required": [
                        "Identity",
                        "AuditData",
                        "CreationDate",
                        "IsValid",
                        "ObjectState",
                        "Operations",
                        "RecordType",
                        "ResultCount",
                        "ResultIndex",
                        "UserIds",
                        "ObjectIds",
                        "IPAddresses"
                    ]
                }
            }
        }
    }
  8. Add a Condition action. This will determine if any results were returned for the data range. For the value, add length(body('Parse_Stream_Results')?['value']). Change your condition to “is greater than” with the value as 0.
    FlowStep8.png
  9. In the Yes path add the Parse JSON action. This action will break out the AuditData field within the results, as it contains most of the useful information for the reports. Select AuditData from the earlier Parse JSON action; this will automatically wrap the action in an Apply to each loop.

    FlowStep9.png
    Add the Schema shown below.
    {
        "type": "object",
        "properties": {
            "CreationTime": {
                "type": "string"
            },
            "Id": {
                "type": "string"
            },
            "ResultStatus": {
                "type": "string"
            },
            "UserType": {
                "type": "number"
            },
            "Version": {
                "type": "number"
            },
            "UserId": {
                "type": "string"
            },
            "ResourceUrl": {
                "type": "string"
            },
            "ResourceTitle": {
                "type": "string"
            },
            "EntityPath": {
                "type": "string"
            },
            "OperationDetails": {
                "type": "string"
            },
            "ClientApplicationId": {
                "type": "string"
            },
            "ObjectId": {
                "type": "string"
            },
            "ClientIP": {
                "type": "string"
            },
            "RecordType": {
                "type": "number"
            },
            "Operation": {
                "type": "string"
            },
            "OrganizationId": {
                "type": "string"
            },
            "UserKey": {
                "type": "string"
            },
            "Workload": {
                "type": "string"
            }
        }
    }
  10. This step is optional. If you want the approximate location where the actions were performed, you can leverage the IP address to get the details. This allows us to build the map into our reports. Alternatively, if your users' location information is tracked in their profiles, you could look up the user profile instead. 

    Note: This uses a service from http://ipwhois.io . The free version allows for 1,000 queries a day; unlimited queries cost $99 a year. If you already have a preferred service that can do this, you can use that instead.

    Add the HTTP action. Make sure to select Get for the Method. The URI value is http://free.ipwhois.io/json/ followed by the ClientIP value from the previous action. Make sure to add the Headers as shown below. The free version doesn’t require authentication, but you will need to add that in if you use the paid service.
    FlowStep10.png
  11. Add the Parse JSON action (only required if you used the previous step). Add the Body from the previous action.

    FlowStep11.png
    Add the Schema shown below.
    {
        "type": "object",
        "properties": {
            "ip": {
                "type": "string"
            },
            "success": {
                "type": "boolean"
            },
            "type": {
                "type": "string"
            },
            "continent": {
                "type": "string"
            },
            "continent_code": {
                "type": "string"
            },
            "country": {
                "type": "string"
            },
            "country_code": {
                "type": "string"
            },
            "country_flag": {
                "type": "string"
            },
            "country_capital": {
                "type": "string"
            },
            "country_phone": {
                "type": "string"
            },
            "country_neighbours": {
                "type": "string"
            },
            "region": {
                "type": "string"
            },
            "city": {
                "type": "string"
            },
            "latitude": {
                "type": "string"
            },
            "longitude": {
                "type": "string"
            },
            "asn": {
                "type": "string"
            },
            "org": {
                "type": "string"
            },
            "isp": {
                "type": "string"
            },
            "timezone": {
                "type": "string"
            },
            "timezone_name": {
                "type": "string"
            },
            "timezone_dstOffset": {
                "type": "string"
            },
            "timezone_gmtOffset": {
                "type": "string"
            },
            "timezone_gmt": {
                "type": "string"
            },
            "currency": {
                "type": "string"
            },
            "currency_code": {
                "type": "string"
            },
            "currency_symbol": {
                "type": "string"
            },
            "currency_rates": {
                "type": "string"
            },
            "currency_plural": {
                "type": "string"
            }
        }
    }
  12. Add the SharePoint Create Item action. Select the site where you created the list. Select the Audit Data list.
    FlowStep12.png
    1. For Title we converted to the Friendly Name of the action using an If expression.

      FlowStep12a.png
      if(equals(body('Parse_Stream_AuditDetails')?['Operation'],'StreamCreateVideo'),'Created video',if(equals(body('Parse_Stream_AuditDetails')?['Operation'],'StreamCreateVideoComment'),'Commented on video',if(equals(body('Parse_Stream_AuditDetails')?['Operation'],'StreamDeleteChannel'),'Deleted channel',if(equals(body('Parse_Stream_AuditDetails')?['Operation'],'StreamDeleteGroup'),'Deleted group',if(equals(body('Parse_Stream_AuditDetails')?['Operation'],'StreamDeleteUser'),'Deleted user', if(equals(body('Parse_Stream_AuditDetails')?['Operation'],'StreamDeleteUserDataExport'),'Deleted User Data Export',if(equals(body('Parse_Stream_AuditDetails')?['Operation'],'StreamDeleteVideo'),'Deleted video',if(equals(body('Parse_Stream_AuditDetails')?['Operation'],'StreamDeleteVideoComment'),'Deleted video comment',if(equals(body('Parse_Stream_AuditDetails')?['Operation'],'StreamDeleteVideoTextTrack'),'Deleted text track',if(equals(body('Parse_Stream_AuditDetails')?['Operation'],'StreamDeleteVideoThumbnail'),'Deleted thumbnail',if(equals(body('Parse_Stream_AuditDetails')?['Operation'],'StreamEditAdminGlobalRoleMembers'),'Edited global role assignment',if(equals(body('Parse_Stream_AuditDetails')?['Operation'],'StreamEditAdminTenantSettings'),'Edited tenant settings',if(equals(body('Parse_Stream_AuditDetails')?['Operation'],'StreamEditChannel'),'Edited channel',if(equals(body('Parse_Stream_AuditDetails')?['Operation'],'StreamEditUser'),'Edited user',if(equals(body('Parse_Stream_AuditDetails')?['Operation'],'StreamEditGroupMemberships'),'Edited group memberships', if(equals(body('Parse_Stream_AuditDetails')?['Operation'],'StreamEditGroup'),'Edited group',if(equals(body('Parse_Stream_AuditDetails')?['Operation'],'StreamEditUserSettings'),'Edited user settings',if(equals(body('Parse_Stream_AuditDetails')?['Operation'],'StreamEditVideo'),'Edited video',if(equals(body('Parse_Stream_AuditDetails')?['Operation'],'StreamEditVideoPermissions'),'Edited video 
permission',if(equals(body('Parse_Stream_AuditDetails')?['Operation'],'StreamEditLiveIngest'),'Edited live ingest',if(equals(body('Parse_Stream_AuditDetails')?['Operation'],'StreamEditLiveEvent'),'Edited live event',if(equals(body('Parse_Stream_AuditDetails')?['Operation'],'StreamInvokeChannelSetThumbnail'),'Set channel thumbnail',if(equals(body('Parse_Stream_AuditDetails')?['Operation'],'StreamInvokeExportUserData'),'Exported user data',if(equals(body('Parse_Stream_AuditDetails')?['Operation'],'StreamInvokeUserDataExportDownload'),'Downloaded user data',if(equals(body('Parse_Stream_AuditDetails')?['Operation'],'StreamInvokeVideoDownload'),'Downloaded video', if(equals(body('Parse_Stream_AuditDetails')?['Operation'],'StreamInvokeVideoLike'),'Liked video',if(equals(body('Parse_Stream_AuditDetails')?['Operation'],'StreamInvokeVideoMakePublic'),'Made video public',if(equals(body('Parse_Stream_AuditDetails')?['Operation'],'StreamInvokeVideoMakePrivate'),'Made video private',if(equals(body('Parse_Stream_AuditDetails')?['Operation'],'StreamInvokeVideoSetLink'),'Linked on Video',if(equals(body('Parse_Stream_AuditDetails')?['Operation'],'StreamInvokeVideoShare'),'Shared video',if(equals(body('Parse_Stream_AuditDetails')?['Operation'],'StreamInvokeVideoTextTrackUpload'),'Uploaded text track',if(equals(body('Parse_Stream_AuditDetails')?['Operation'],'StreamInvokeVideoThumbnailUpload'),'Uploaded thumbnail',if(equals(body('Parse_Stream_AuditDetails')?['Operation'],'StreamInvokeVideoUnLike'),'Unliked video',if(equals(body('Parse_Stream_AuditDetails')?['Operation'],'StreamInvokeVideoUpload'),'Uploaded video', if(equals(body('Parse_Stream_AuditDetails')?['Operation'],'StreamInvokeVideoView'),'Viewed video',if(equals(body('Parse_Stream_AuditDetails')?['Operation'],'StreamInvokeLiveStartBroadcast'),'Started Live Broadcast',if(equals(body('Parse_Stream_AuditDetails')?['Operation'],'StreamInvokeLiveStartIngest'),'Started 
Ingest',if(equals(body('Parse_Stream_AuditDetails')?['Operation'],'StreamInvokeLiveStopIngest'),'Stopped Ingest',if(equals(body('Parse_Stream_AuditDetails')?['Operation'],'StreamInvokeVideoPostProcess'),'Video in post process', 'Unknown')))))))))))))))))))))))))))))))))))))))
    2. Enter the next values as shown below. If you need to remove personal data, do not add UserId anywhere shown here. You may also need to remove the IP address.
      FlowStep12b.png
    3. For Location add the value of concat(body('Parse_Location')?['city'],', ',body('Parse_Location')?['region'])
    4. For Application add the If Statement shown below. This will provide the friendly name of the application used to create the video. In the future it may be used to show what application the videos were viewed from.
      if(equals(body('Parse_Stream_AuditDetails')?['ClientApplicationId'],'cf53fce8-def6-4aeb-8d30-b158e7b1cf83'),'Stream Portal',if(equals(body('Parse_Stream_AuditDetails')?['ClientApplicationId'],'844cca35-0656-46ce-b636-13f48b0eecbd'),'Stream Mobile',if(equals(body('Parse_Stream_AuditDetails')?['ClientApplicationId'],'574df941-661b-4bfc-acb0-0a07de7de341'),'Dynamics Talent',if(equals(body('Parse_Stream_AuditDetails')?['ClientApplicationId'],'00000005-0000-0ff1-ce00-000000000000'),'Yammer',if(equals(body('Parse_Stream_AuditDetails')?['ClientApplicationId'],'4580fd1d-e5a3-4f56-9ad1-aab0e3bf8f76'),'Microsoft Teams Exporter',if(equals(body('Parse_Stream_AuditDetails')?['ClientApplicationId'],'cc15fd57-2c6c-4117-a88c-83b1d56b4bbe'),'Microsoft Teams Middle tier',if(equals(body('Parse_Stream_AuditDetails')?['ClientApplicationId'],'1fec8e78-bce4-4aaf-ab1b-5451cc387264'),'Microsoft Teams Native Client',if(equals(body('Parse_Stream_AuditDetails')?['ClientApplicationId'],'5e3ce6c0-2b1f-4285-8d4b-75ee78787346'),'Microsoft Teams Web Client',if(equals(body('Parse_Stream_AuditDetails')?['ClientApplicationId'],'7557eb47-c689-4224-abcf-aef9bd7573df'),'Microsoft Teams Scheduling and Broadcast Service',if(equals(body('Parse_Stream_AuditDetails')?['ClientApplicationId'],'e478694e-c391-4a61-a6be-c953d2372058'),'OfficeMix.com',if(equals(body('Parse_Stream_AuditDetails')?['ClientApplicationId'],'d3590ed6-52b3-4102-aeff-aad2292ab01c'),'Office First Party native app',if(equals(body('Parse_Stream_AuditDetails')?['ClientApplicationId'],'2acd2ef7-b80e-46d2-87f8-90e10adc3fca'),'Microsoft Learning 3rd party app','Unknown'))))))))))))
    5. For the last value add OperationDetails
      FlowStep12e.png
  13. The last step emails the user running the report and is not part of the Apply to each section. Add the desired email address in the To field; for this we just used the admin account variable. Add the desired subject and body. You can add HTML to the email if you wish; just make sure to expand the advanced options and change IsHTML to Yes.
    FlowStep13.png
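The deeply nested if() expression in step 12 is easier to reason about as a lookup table. As an illustration of my own (not part of the Flow itself), here is a subset of the same operation-to-friendly-name mapping in Python:

```python
# Subset of the operation -> friendly name mapping from step 12,
# expressed as a lookup table instead of nested if() expressions.
FRIENDLY_NAMES = {
    "StreamCreateVideo": "Created video",
    "StreamInvokeVideoUpload": "Uploaded video",
    "StreamInvokeVideoView": "Viewed video",
    "StreamInvokeVideoLike": "Liked video",
    "StreamInvokeVideoUnLike": "Unliked video",
    "StreamCreateVideoComment": "Commented on video",
    "StreamInvokeVideoShare": "Shared video",
    "StreamInvokeVideoDownload": "Downloaded video",
    "StreamEditVideoPermissions": "Edited video permission",
}

def friendly_name(operation):
    # Anything unmapped falls through to 'Unknown', as in the Flow expression.
    return FRIENDLY_NAMES.get(operation, "Unknown")
```

If you later maintain this in Flow itself, the same table shape keeps the mapping in one place, so adding a new Stream operation means adding one entry rather than one more nested if().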

Pulling the Data into Power BI

This is where we can pull in the data set to build some nice reports. You will need to download Power BI desktop to build these reports since we need to connect to the SharePoint list created earlier.

 

  1. Open Power BI Desktop
  2. Create New
  3. Click Get Data
  4. Select Online Services > SharePoint Online List
    PBIStep4.png
  5. Click Connect
  6. Add the URL to the site where you created the SharePoint List
  7. Click Ok
  8. Select the List you created for the Audit Data
    PBIStep8.png
  9. Click Load

Building a Basic Report

Now we can add some basic report information and filters. Below are basic examples of items you can add. You can download this prebuilt report to see how I built it out. Make sure to update the settings for your environment if you use this template.


Slicers

  1. Select the Slicer under visualizations
    ReportSlicerStep1.png
  2. Check the Date field (note that you may need to convert the data type in your table to get this to display as a slider, Power BI may convert automatically but it depends on the amount of content)
    ReportSlicerStep2.png

Total views
This graph shows the total views per video. Depending on the number of videos this chart may not work for your needs.

 

  1. Select the Donut or Pie Chart
    ReportTotalViewsStep1.png
  2. Check the AuditData.ResourceTitle so that it is used by the chart
  3. Drag the Operations field into Values (this should convert to count)
  4. Drag the Operations field to the Filters view and check StreamInvokeVideoView
    ReportTotalViewsStep4.png

Total views per day
This line chart will show total views daily.

 

  1. Select the Line Chart
    ReportTotalViewsDayStep1.png
  2. Check the Date so that it is used by the chart
  3. You may want to remove items like Year and Quarter from the date field. Just click on the x to remove.
    ReportTotalViewsDayStep3.png
  4. Add the Title field to the values box
  5. Drag the Title field into the filter
  6. Select Video Viewed

Total view cards
Use this to display the total views for all videos. When filtering based on video this will adjust accordingly.

 

  1. Select the Card
    ReportTotalViewCardsStep1.png
  2. Check Title
  3. Change the field to Count via its drop-down menu.
    ReportTotalViewCardsStep3.png
  4. Drag Title into the Filters
  5. Select Video Viewed

Publishing the Report

To schedule the refresh for the report it must first be published to the Power BI service. Users that view the report will need to have a Power BI Pro license if you utilize scheduled refresh to keep the data up to date.

 

  1. In your report, click Publish
  2. Select the workspace location to publish it to
    PBIPublishStep2.png
  3. Click Select
  4. Click Got it
  5. Login to Power BI online and go to the workspace you published to above
  6. Locate the report dataset
    PBIPublishStep6.png
  7. Click on the … next to your dataset and click Schedule Refresh
  8. Expand Schedule Refresh
    PBIPublishStep8.png
  9. Turn on “Keep your data up to date”
  10. Choose your “Refresh frequency”
  11. Click on “Add another time”. If you are doing the daily audit log pull, I recommend setting your refresh for 1:00 AM. This gives the audit log import time to push all the new data into SharePoint.

    PBIPublishStep11.png
  12. Click Apply


Thanks for reading and happy streaming!

27 Comments
Copper Contributor

Useful tutorial; however, things like views/starts only really give a vague glimpse of a video's engagement. Other more useful metrics are things like the percentage of users who watched 50%/75% of the video. How could this be captured?

Microsoft

@Rory Champion - Stream doesn't yet capture percentage-based viewing. We plan to start working on this sort of tracking in 2019 so you can get insights like which part of the video was viewed.

Brass Contributor

This is great!  I appreciate the level of detail provided in how this was set up, and it clearly shows the connections/integrations between the different Office 365 services.

Copper Contributor

Is there any update to this to access the Audit Logs via the management APIs, as we are not able to use Basic Auth as outlined here?

Microsoft

@Graham Parkin  You can create a custom connector that leverages the same search so that you aren't saving the account information in the Flow. See this post that came out this month. https://flow.microsoft.com/en-us/blog/custom-connector-microsoft-flow-search-unified-audit-log/ This option still shows as Basic Authentication but if your concern was only saving it in the Flow then maybe this can work as your solution.

Deleted
Not applicable

This is fantastic @Cory Williams !

 

One issue I ran into when pulling older data was that the User column would error (because that user account is no longer available - since the person left the company).  I tweaked the Flow to handle that and everything is good.

Copper Contributor

Great article. 

Brass Contributor

Great article. We ran the Flow part to get back the Stream items into SharePoint but it only always returns 400 items instead of the expected 5000. Can you assist @Cory Williams or @Marc Mroz ?

Copper Contributor

The audit logs for views of a video are inconsistent with the views of the video that show up on the Stream site.  Any reason for this?

Microsoft

@Chetan_M I will need to verify, but what I expect is that these two are measured by different metrics. The audit logs will record a view even if the person just clicked play and immediately left, whereas I believe the views on the page only count after the video has been watched for at least 6 seconds. @Marc Mroz is this correct?

Microsoft

@JohnnyLopez that's interesting. It would likely take some time to track that down, as it could be any one of the pieces: the flow's calls, the audit logs, putting items into SP, or the flow timing out. 

Copper Contributor

@Cory Williams  This is indeed a very good article. Thank you for sharing this. I am in need of your help. I am also facing the same issue as @JohnnyLopez and @Chetan_M. Initially the list had 400 items, and then around 700 items after the second run. I took a Stream video which was posted recently (a month back) and compared parameters such as number of views and likes against the list, and it does not provide the exact results. The video actually had 5 likes and 250 views, but the list data showed only 4 items for that particular video. I am a bit confused by this. Will it not return all 3 months of data in the first run? Could you please help with this?

Copper Contributor

@Cory Williams @Marc Mroz Were you able to confirm and if so is there a way to filter out the views <6 seconds from the logs so we can get consistency in the data?

Deleted
Not applicable

Hey @Cory Williams - I've been running this for a couple of months now and found that once every week or so, it'll timeout.  I suspect that every once in a while the log data is too much for the Flow limitations.

 

Today, I exported the flow and imported it to Azure as a logic app, which I think will fix the issue.

Great post. Are there plans to provide Stream actions for Microsoft Flow rather than using HTTP actions?

Microsoft

@Simon Denton I imagine there are, but they would be dependent on the public APIs being made available for Stream which as of right now according to the roadmap is Q2 2020. So still a ways out. I am getting some feedback that there are timeouts or that the pull maxes out at 400 items so when I get time will be looking at using a different API or converting as @Deleted did above to a logic app. @Deleted did you see an improvement?

Thanks @Cory Williams . In my experience sticking to the 100 item limit and using a loop is a much more reliable method.

Deleted
Not applicable

@Cory Williams - Yes, much more reliable.  I saw one failure in the past week that I'm going to dig into at some point.

 

I have a second of the same logic app, but instead of running on a schedule, it is triggered by a SharePoint list.  Whenever I see an error in the scheduled logic app, I'll manually run the second one by creating an entry in the list for that date.  I did this for the initial import, when I wanted to grab the last 90 days without overloading the Flow (it was a Flow back then).

Copper Contributor

I got this set up, but it only pulled the data from the previous day if I made the start date parameter addDays(startOfDay(utcNow()), -1). Maybe this is a silly workaround, but at least I have it working! Thanks for the tip!

Bronze Contributor

How about the video's own statistics? Is it possible to get a list of video quality levels, what resolutions are available, etc.? Can we list the source of the video (e.g. can we find Teams recordings)? Is it possible to list how long the recordings are?

I have very specific needs, as some of our users have reported poor quality on videos.

Microsoft

@Petri X the video-specific data that you are looking for here is not available today. It is something that is on the roadmap for Q2 2020, although I'm not quite sure if it will touch on your specific needs. It's hard to say since I am not on that team. https://www.microsoft.com/en-us/microsoft-365/roadmap?rtc=2&filters=&searchterms=stream%2Cvideo%2Can... If you haven't seen it, there is documentation regarding how the videos are encoded once they are uploaded. https://docs.microsoft.com/en-us/stream/network-overview

Copper Contributor

I found this tool and it's wonderful.  I do have an issue.  When it runs at 23:59 and captures the audit data, the created date does not match the audit log date.  In the SharePoint list all entries are one day off.  

 

Did not make any changes to recurrence, start or end time 

 

example.  audit log entry time =  1/22/2020   SharePoint list time =  1/23/2020

 

Thanks

Copper Contributor

Hi,

 

I've been trying to get this to work for some time but it doesn't. If I use your sample report I can't get it to refresh because I don't have credentials, which is fine. If I point it to the SharePoint list I created based on your instructions, it complains there are columns missing. Specifically the user.title, etc. line doesn't work. I've managed to get it to go past that line in the advanced query editor, but when I try to refresh it says ID 2 doesn't exist in the rowset. If I manually add that, then it says ID 3 doesn't exist. If I add that, it says ID 4 doesn't exist, and so on.

Your instructions are incomplete on how to build this the same as what's in your sample isn't what you get if you follow the instructions. Please explain how I can redirect your sample data to my list I created.

 

Thanks


Stuart

Iron Contributor

@Cory Williams I had this working for a while, and now it no longer works; it fails on the GET to the https://outlook.office365.com/psws/service.svc/UnifiedAuditLog URI.   Did something change on the Microsoft side to not allow this any longer?  I can't get to this with any account I use in any tenant I have.  Appreciate any insight.  Thanks!

Brass Contributor

This works but it caps me at 400 list items. It was not like this during initial runs, so I am guessing that I have used up my monthly quota for pulling audit logs.

Iron Contributor

@AmarHonnungar I recently experienced this limit as well and it is frustrating.  It makes it impossible now to set things back up using a clean list and starting from scratch.  I've abandoned this for now.  Hopefully better options are ahead, especially with Stream moving to SharePoint/OneDrive. 

Version history
Last update: Mar 12 2019 08:36 PM