Audit

Powershell Script to list ALL videos in your 365 Stream environment
I hope this is useful to everyone. My goal was to get a list of all videos in my Stream environment so that I could contact each video creator about the changes coming to Stream, figure out how much work the move would involve, and see how much video is created and never used. My original solution (posted here 7 Oct 2020) is now retired: thanks to Twan van Beers' code at https://neroblanco.co.uk/2022/02/get-list-of-videos-from-microsoft-stream/ I've been able to build a single .ps1 script that does everything I need. I've left the original code at the bottom of this post. Take a look at the comments in the code for how to use the script. My original script gave me information about the videos, but it was hard to use AND didn't tell me which videos were in which channels; this new version of Twan van Beers' script gives me both.

Save the new code (Oct 2022) as a PowerShell script, e.g. get-stream-video-info.ps1. Then open a PowerShell window, navigate to the folder get-stream-video-info.ps1 is in, and run:

.\get-stream-video-info.ps1 "C:\Temp\streamanalysis\stream7Oct22-getnbstreamvideoinfo.csv"

Follow the on-screen prompts.

>>> New Code Oct 2022 <<<

using namespace System.Management.Automation.Host

[CmdletBinding()]
param (
    [parameter(Position=0,Mandatory=$False)]
    [string]$OutputCsvFileName,
    [parameter(Position=1,Mandatory=$False)]
    [switch]$OpenFileWhenComplete = $False
)

# ----------------------------------------
# How to use
<#--
Open a PowerShell window, then type in the following and press Enter:

    .\get-stream-video-info.ps1 "C:\Temp\streamanalysis\stream7Oct22-getnbstreamvideoinfo.csv"

You'll then be prompted with 3 options:

    [V] Videos  [C] ChannelVideos  [A] All  [?] Help  (default is "V")

V - Videos: gets a list of all the videos in your Stream environment.
    NOTE: you may need to alter the variables in the code if you have many thousands of videos.
C - ChannelVideos: gets a list of all the videos and the channels they are in.
    NOTE: this returns a filtered view of only the videos associated with a channel.
A - All: returns both the Videos and the ChannelVideos outputs.

You'll then be prompted for a user to log in to the Stream portal with; this is so the script
can get a security token to do its work with. Choose/use an account with full access to Stream.

If you passed a CSV file path after the script name, the script exports one or two CSV files
based on the option chosen:
    <your folder path, your filename>-videos<your file ending>
    and/or
    <your folder path, your filename>-channelVideos<your file ending>

If you don't want to export files, the script creates objects you can use in other ways:
    V or A - creates $ExtractData, a list of every video and key properties for each video.
    C or A - creates $videosPerChannel, which lists key information about each video AND the
             channel it is part of.

----------------------------------------------------------------------------------------------
Original source (my script):
https://techcommunity.microsoft.com/t5/microsoft-stream-classic/powershell-script-to-list-all-videos-in-your-365-stream/m-p/1752149
which inspired Twan van Beers to write:
https://neroblanco.co.uk/2022/02/get-list-of-videos-from-microsoft-stream/
I've then taken Twan's script and modified it to do what I require in my environment, namely:
get the video information AND the channels the videos are part of.

For my 1000 or so videos and 35 channels, it takes about one minute to run using the All option.
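Example invocations (both parameters are defined in the param block above; -OpenFileWhenComplete
simply opens each exported CSV when the run finishes):

    .\get-stream-video-info.ps1 "C:\Temp\streamanalysis\stream7Oct22-getnbstreamvideoinfo.csv"
    .\get-stream-video-info.ps1 "C:\Temp\streamanalysis\stream7Oct22-getnbstreamvideoinfo.csv" -OpenFileWhenComplete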
This meant I was able to set up an intranet video library with a channel metadata column and folders per channel (so I could give edit rights to channel owners without opening up the entire library or having to use multiple libraries), and eventually download the videos, then upload them into the library using ShareGate to reinstate some of the key metadata, i.e. created date, person who created them, etc.
--#>

# ----------------------------------------------------------------------------------------------
function Show-OAuthWindowStream {
    param (
        [string]$Url,
        [string]$WindowTitle
    )
    $Source = `
@"
[DllImport("wininet.dll", SetLastError = true)]
public static extern bool InternetSetOption(IntPtr hInternet, int dwOption, IntPtr lpBuffer, int lpdwBufferLength);
"@
    $WebBrowser = Add-Type -memberDefinition $Source -passthru -name $('WebBrowser'+[guid]::newGuid().ToString('n'))
    $INTERNET_OPTION_END_BROWSER_SESSION = 42
    # Clear the current session
    $WebBrowser::InternetSetOption([IntPtr]::Zero, $INTERNET_OPTION_END_BROWSER_SESSION, [IntPtr]::Zero, 0) | out-null

    Add-Type -AssemblyName System.Windows.Forms
    $Form = New-Object -TypeName System.Windows.Forms.Form -Property @{Width = 600; Height = 800 }
    $Script:web = New-Object -TypeName System.Windows.Forms.WebBrowser -Property @{Width = 580; Height = 780; Url = ($URL -f ($Scope -join "%20")) }
    $Web.ScriptErrorsSuppressed = $True
    $Form.Controls.Add($Web)
    $Featured = {
        $Head = $Web.Document.GetElementsByTagName("head")[0];
        $ScriptEl = $Web.Document.CreateElement("script");
        $Element = $ScriptEl.DomElement;
        # JavaScript function to get the sessionInfo including the Token
        $Element.text = `
@'
function CaptureToken() {
    if( typeof sessionInfo === undefined ) {
        return '';
    } else {
        outputString = '{';
        outputString += '"AccessToken":"' + sessionInfo.AccessToken + '",';
        outputString += '"TenantId":"' + sessionInfo.UserClaim.TenantId + '",';
        outputString += '"ApiGatewayUri":"' + sessionInfo.ApiGatewayUri + '",';
        outputString += '"ApiGatewayVersion":"' + sessionInfo.ApiGatewayVersion + '"';
        outputString += '}';
        return outputString;
    }
}
'@;
        $Head.AppendChild($ScriptEl);
        $TenantInfoString = $Web.Document.InvokeScript("CaptureToken");
        if( [string]::IsNullOrEmpty( $TenantInfoString ) -eq $False ) {
            $TenantInfo = ConvertFrom-Json $TenantInfoString
            if ($TenantInfo.AccessToken.length -ne 0 ) {
                $Script:tenantInfo = $TenantInfo;
                $Form.Controls[0].Dispose()
                $Form.Close()
                $Form.Dispose()
            }
        }
    }
    $Web.add_DocumentCompleted($Featured)
    $Form.AutoScaleMode = 'Dpi'
    $Form.ShowIcon = $False
    $Form.Text = $WindowTitle
    $Form.AutoSizeMode = 'GrowAndShrink'
    $Form.StartPosition = 'CenterScreen'
    $Form.Add_Shown( { $Form.Activate() })
    $Form.ShowDialog() | Out-Null
    write-output $Script:tenantInfo
}

# ----------------------------------------------------------------------------------------------
function Get-RequestedAssets([PSCustomObject]$Token, [string]$Url, [string]$Label) {
    $Index = 0
    $MainUrl = $Url
    $AllItems = @()
    do {
        $RestUrl = $MainUrl.Replace("`$skip=0", "`$skip=$Index")
        Write-Host "    Fetching ... $($Index) to $($Index+100)"
        $Items = @((Invoke-RestMethod -Uri $RestUrl -Headers $Token.headers -Method Get).value)
        $AllItems += $Items
        $Index += 100
    } until ($Items.Count -lt 100)
    Write-Host "    Fetched $($AllItems.count) items"
    $Assets = $AllItems | Select-Object `
        @{Name='Type';Expression={$Label}},`
        Id, Name,`
        @{Name='Size(MB)';Expression={$_.AssetSize/1MB}},`
        PrivacyMode, State, VideoMigrationStatus, Published, PublishedDate, ContentType, Created, Modified,`
        @{name='Media.Duration';Expression={$_.Media.Duration}},`
        @{name='Media.Height';Expression={$_.Media.Height}},`
        @{name='Media.Width';Expression={$_.Media.Width}},`
        @{name='Media.isAudioOnly';Expression={$_.media.isAudioOnly}},`
        @{name='Metrics.Comments';Expression={$_.Metrics.Comments}},`
        @{name='Metrics.Likes';Expression={$_.Metrics.Likes}},`
        @{name='Metrics.Views';Expression={$_.Metrics.Views}},`
        @{name='ViewVideoUrl';Expression={("https://web.microsoftstream.com/video/" + $_.Id)}},`
        @{name='VideoCreatorName';Expression={$_.creator.name}},`
        @{name='VideoCreatorEmail';Expression={$_.creator.mail}},`
        @{name='VideoDescription';Expression={$_.description}}
    write-output $Assets
}

function Get-VideoChannels([PSCustomObject]$Token, [string]$Url, [string]$Label) {
    # This will get the list of channels
    $Index = 0
    $MainUrl = $Url
    $AllItems = @()
    do {
        $RestUrl = $MainUrl.Replace("`$skip=0", "`$skip=$Index")
        Write-Host "    Fetching ... $($Index) to $($Index+100)"
        $Items = @((Invoke-RestMethod -Uri $RestUrl -Headers $Token.headers -Method Get).value)
        $AllItems += $Items
        $Index += 100
    } until ($Items.Count -lt 100)
    Write-Host "    Fetched $($AllItems.count) items"
    # To add properties to this section look at
    # https://aase-1.api.microsoftstream.com/api/channels?$skip=0&$top=100&adminmode=true&api-version=1.4-private
    $Channels = $AllItems | Select-Object `
        @{Name='Type';Expression={$Label}},`
        Id, Name, Description,`
        @{Name='MetricsVideos';Expression={$_.metrics.videos}}
    #write-host $channels.count
    write-output $Channels
}

function Get-channelVideos([PSCustomObject]$Token, [PSCustomObject]$allChannels, [string]$Label) {
    # For each channel URL, page through all the videos and capture the channel
    # name against each video id and name
    $MainUrl = "https://aase-1.api.microsoftstream.com/api/channels/ChannelIDToSwap/videos?`$top=50&`$skip=0&`$filter=published%20and%20(state%20eq%20%27completed%27%20or%20contentSource%20eq%20%27livestream%27)&`$expand=creator,events&adminmode=true&`$orderby=name%20asc&api-version=1.4-private"
    $allVideosPerChannel = @()
    foreach($chan in $allChannels) {
        $thisChannelid = $chan.id
        $chanUrl = $MainUrl.Replace("ChannelIDToSwap", $thisChannelid)
        $chanName = $chan.name
        $AllItems = @()
        $items = ""
        $Index = 0
        #write-host $chanUrl
        # Loop through the pages for this channel
        do {
            $RestUrl = $chanUrl.Replace("`$skip=0", "`$skip=$Index")
            #write-host $restUrl
            #Write-Host "$chanName | Fetching ... $($Index) to $($Index+50)"
            $Items = @((Invoke-RestMethod -Uri $RestUrl -Headers $Token.headers -Method Get).value)
            # Keep each page's videos, tagged with the channel name
            # (the original overwrote this variable each page, so only one page survived)
            $AllItems += $items | select id, name,
                @{Name='Channel';Expression={$chanName}},
                @{Name='Type';Expression={$Label}}
            $Index += 50
        } until ($Items.Count -lt 50)   # pages are 50 items, so stop when a page comes back short
        # Got this channel's videos; mix with the $chan info and add to $allVideosPerChannel
        $allVideosPerChannel += $AllItems
        $AllItems = @()
        $items = ""
    }
    Write-Host "    Fetched $($allVideosPerChannel.count) videos in $($allChannels.count) channels"
    # To add properties to this section look at
    # https://aase-1.api.microsoftstream.com/api/channels?$skip=0&$top=100&adminmode=true&api-version=1.4-private
    write-output $allVideosPerChannel
}

# ----------------------------------------------------------------------------------------------
function Get-StreamToken() {
    $TenantInfo = Show-OAuthWindowStream -url "https://web.microsoftstream.com/?noSignUpCheck=1" -WindowTitle "Please login to Microsoft Stream ..."
    $Token = $TenantInfo.AccessToken
    $Headers = @{
        "Authorization"   = ("Bearer " + $Token)
        "accept-encoding" = "gzip, deflate, br"
    }
    $UrlTenant = $TenantInfo.ApiGatewayUri
    $ApiVersion = $TenantInfo.ApiGatewayVersion
    $UrlBase = "$UrlTenant{0}?`$skip=0&`$top=100&adminmode=true&api-version=$ApiVersion"
    $RequestToken = [PSCustomObject]::new()
    $RequestToken | Add-Member -Name "token" -MemberType NoteProperty -Value $Token
    $RequestToken | Add-Member -Name "headers" -MemberType NoteProperty -Value $Headers
    $RequestToken | Add-Member -Name "tenantInfo" -MemberType NoteProperty -Value $TenantInfo
    $Urls = [PSCustomObject]::new()
    $RequestToken | Add-Member -Name "urls" -MemberType NoteProperty -Value $Urls
    $RequestToken.urls | Add-Member -Name "Videos" -MemberType NoteProperty -Value ($UrlBase -f "videos")
    $RequestToken.urls | Add-Member -Name "Channels" -MemberType NoteProperty -Value ($UrlBase -f "channels")
    $RequestToken.urls | Add-Member -Name "Groups" -MemberType NoteProperty -Value ($UrlBase -f "groups")
    $UrlBase = $UrlBase.replace("`$skip=0&", "")
    $RequestToken.urls | Add-Member -Name "Principals" -MemberType NoteProperty -Value ($UrlBase -f "principals")
    write-output $RequestToken
}

function New-Menu {
    [CmdletBinding()]
    param(
        [Parameter(Mandatory)]
        [ValidateNotNullOrEmpty()]
        [string]$Title,
        [Parameter(Mandatory)]
        [ValidateNotNullOrEmpty()]
        [string]$Question
    )
    $videos = [ChoiceDescription]::new('&Videos', 'Videos')
    $channelvideos = [ChoiceDescription]::new('&ChannelVideos', 'All Videos by Channel')
    $all = [ChoiceDescription]::new('&All', 'All videos AND all videos by channel')
    $options = [ChoiceDescription[]]($videos, $channelvideos, $all)
    $result = $host.ui.PromptForChoice($Title, $Question, $options, 0)
    switch ($result) {
        0 { 'Videos' }
        1 { 'ChannelVideos' }
        2 { 'All' }
    }
}

$menuoutome = New-Menu -title 'Stream videos' -question 'What do you want to output?'
#write-host $menuoutome

$StreamToken = Get-StreamToken
$urlQueryToUse = $StreamToken.Urls.Videos
# By default $StreamToken.Urls.Videos is something like
# https://aase-1.api.microsoftstream.com/api/videos?$skip=900&$top=100&adminmode=true&api-version=1.4-private
# To get creator and event details you need to add $expand=creator,events to the URL.
# To do that you need &`$expand=creator,events - without the backtick, PowerShell
# thinks $expand is a variable. E.g.:
#   $urlQueryToUse = $StreamToken.Urls.Videos + "&`$expand=creator,events"
#
# Other option: use the following if you only want to see files that have
# privacymode eq 'organization', i.e. the video is visible to EVERYONE in the
# organisation. Thanks to Ryechz for this.
#
#   $urlQueryToUse = $StreamToken.Urls.Videos + "&orderby=publishedDate%20desc&`$expand=creator,events&`$filter=published%20and%20(state%20eq%20%27Completed%27%20or%20contentSource%20eq%20%27livestream%27)%20and%20privacymode%20eq%20%27organization%27%20"

if($menuoutome -eq 'Videos' -Or $menuoutome -eq 'All'){
    # Modify the -Url submitted to get more data, filter data, or order the output
    $ExtractData = Get-RequestedAssets -token $StreamToken -Url $urlQueryToUse -Label "Videos"
    write-host ""
    write-host "The `$ExtractData object contains all the details about each video. Use `$ExtractData[0] to see the first item in the object, and its properties."
    if( $OutputCsvFileName ) {
        $thisOutputCsvFileName = $OutputCsvFileName.replace(".csv", '-'+$menuoutome+'-videos.csv')
        $ExtractData | Export-CSV $thisOutputCsvFileName -NoTypeInformation -Encoding UTF8
        write-host "The following file has been created: $thisOutputCsvFileName"
        if( $OpenFileWhenComplete ) {
            Invoke-Item $thisOutputCsvFileName
        }
    }
}

if($menuoutome -eq 'ChannelVideos' -Or $menuoutome -eq 'All'){
    # Get the list of channels, filtered to the channel id and name
    $channelList = Get-VideoChannels -token $StreamToken -Url $StreamToken.Urls.Channels -Label "Channels"
    # For each channel get the videos in that channel, so they can be matched up
    # to $ExtractData (the list of all videos)
    $videosPerChannel = Get-channelVideos -token $StreamToken -allChannels $channelList -Label "ChannelVideos"
    write-host ""
    write-host "The `$videosPerChannel object contains key information about each video and the channel it is in. Use `$videosPerChannel[0] to see the first video, and its properties."
    write-host "The `$channelList object contains a list of channels and their properties."
    if( $OutputCsvFileName ) {
        $thisOutputCsvFileName = $OutputCsvFileName.replace(".csv", '-'+$menuoutome+'-channelVideos.csv')
        $videosPerChannel | Export-CSV $thisOutputCsvFileName -NoTypeInformation -Encoding UTF8
        write-host "The following file has been created: $thisOutputCsvFileName"
        if( $OpenFileWhenComplete ) {
            Invoke-Item $thisOutputCsvFileName
        }
    }
}
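If you run the All option, you can stitch the two outputs together afterwards. A minimal sketch (the Channels column name here is my own choice, not something the script itself outputs):

# Sketch: add a Channels column to $ExtractData by matching video ids from
# $videosPerChannel. Run this after choosing the All option.
$channelsByVideo = @{}
foreach ($cv in $videosPerChannel) {
    # A video can sit in more than one channel, so collect every match
    $channelsByVideo[$cv.id] = @($channelsByVideo[$cv.id]) + $cv.Channel
}
$ExtractData | ForEach-Object {
    $_ | Add-Member -Force -NotePropertyName Channels `
        -NotePropertyValue (($channelsByVideo[$_.Id] | Where-Object { $_ }) -join '; ')
}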
>>> Original Code Oct 2020 <<<

I've left this here in case it is useful to anyone; see the script above for a better solution. That said, this solution relies on you knowing a bit about PowerShell, being a Stream admin, and being happy to manually save some files (I couldn't figure out how to do pass-through Windows authentication for the script), so you have to manually save each paged JSON file of 100 videos. It took me about 20 minutes to export information about 591 videos (and about 3 hours to make the script).

To use the script:
- update each variable marked with #<<<< Update this value
- in a normal (not admin) PowerShell window, run the script (I copy and paste logical parts into the PowerShell window)
- you will be given several URLs to visit and save the JSON files from
- once you have the JSON files, the final part of the script reads those and exports a CSV file with a row per video in Stream

NOTE: as an admin you see all videos, so before you share the CSV with others, be aware that there may be sensitive information in it that most users can't see due to Stream's built-in security.

I don't have much time (hence I made this script), so please don't expect quick answers to any questions. This script is rough; use it if it helps, but be professional and check it before you use it.

##>> Update 5 Aug 2021 <<##
Thanks to everyone who has commented; I've updated the code below with your suggestions. You still have to manually save the JSON browser tabs that show up as JSON files into a folder, but other than that I hope it is now easier for you to use 🙂

#reference https://techcommunity.microsoft.com/t5/microsoft-stream-forum/powershell-script-to-audit-and-export-channel-content-details-of/m-p/354832
# Goal of this script:
# - get a list of all videos in Stream for analysis
# - it takes about 20 minutes to do this for 500 Stream videos
#
# First, find out what your API source is:
# go to the following URL in Chrome as an admin of Stream: https://web.microsoftstream.com/browse
# using Developer tools, look at the "console" and search for .api.microsoftstream to find out
# what is between https:// and .api.microsoftstream
# in my case: https://aase-1.api.microsoftstream.com/api/
[string]$rootAPIlocation = "aase-1" #<<<< Update this value to the one you find in the console view
#[string]$rootAPIlocation = "uswe-1" # use this for Western US
#[string]$rootAPIlocation = "euno-1" # use this for the Europe North region

# Enter where on your computer you want the files to go
[string]$PowerShellScriptFolder = "C:\Temp\streamanalysis" #<<<< Update this value
# JSON files will be saved into the "VideosJSON" folder
[string]$streamJSONfolder = Join-Path -Path $PowerShellScriptFolder -ChildPath "VideosJSON" #<<<< Update this value if you want a different folder name

#>>> REMOVES all existing JSON files <<<<
# Remove all JSON items in this folder
Remove-Item -path $streamJSONfolder\* -include *.json -Force -Recurse

# Guess the approx number of videos you think you have, divided by 100, e.g. 9 = 900 videos
[int]$Loopnumber = 9 #<<<< Update this value

# Put in your Stream portal URL
[string]$StreamPortal = "https://web.microsoftstream.com/?NoSignUpCheck=1"
# Put in the URL where you see all videos from in Stream
[string]$StreamPortalVideoRoot = "https://web.microsoftstream.com/browse/" #$StreamPortalChannelRootForFindingVideos
[string]$StreamPortalVideoViewRoot = "https://web.microsoftstream.com/video/" # for watching a video

# This builds, from the info you've put in, a URL which will give back the JSON info about all your videos
[string]$StreamAPIVideos100 = "https://$rootAPIlocation.api.microsoftstream.com/api/videos?NoSignUpCheck=1&`$top=100&`$orderby=publishedDate%20desc&`$expand=creator,events&`$filter=published%20and%20(state%20eq%20%27Completed%27%20or%20contentSource%20eq%20%27livestream%27)&adminmode=true&api-version=1.4-private&`$skip=0"
#$StreamAPIVideos100

# Use the following instead if you only want to see files that have privacymode eq 'organization',
# i.e. the video is visible to EVERYONE in the organisation. Thanks to Ryechz for this.
#
# [string]$StreamAPIVideos100 = "https://$rootAPIlocation.api.microsoftstream.com/api/videos?NoSignUpCheck=1&`$top=100&`$orderby=publishedDate%20desc&`$expand=creator,events&`$filter=published%20and%20(state%20eq%20%27Completed%27%20or%20contentSource%20eq%20%27livestream%27)%20and%20privacymode%20eq%20%27organization%27%20&adminmode=true&api-version=1.4-private&`$skip=0"

[int]$skipCounter = 0
[int]$skipCounterNext = $skipCounter+100
[string]$fileName = "jsonfor-$skipCounter-to-$skipCounterNext.json"

# The next section creates the URLs you need to manually download the JSON from;
# it was too hard to figure out how to do this programmatically with authentication.
Write-Host " Starting Chrome - enter your credentials to load the O365 Stream portal" -ForegroundColor Magenta
# Thanks Conrad Murray for this tip
Start-Process -FilePath 'chrome.exe' -ArgumentList $StreamPortal
Read-Host -Prompt "Press Enter to continue ...."
Write-host " -----------------------------------------" -ForegroundColor Green
Write-host " --Copy and paste each url into chrome----" -ForegroundColor Green
Write-host " --save JSON output into $streamJSONfolder" -ForegroundColor Green
for($i=0; $i -lt $Loopnumber; $i++) {
    $skipCounter = $i*100
    if($skipCounter -eq 0) {
        write-host $StreamAPIVideos100
        Start-Process -FilePath 'chrome.exe' -ArgumentList $StreamAPIVideos100
    }
    else {
        write-host $StreamAPIVideos100.replace("skip=0","skip=$skipCounter")
        # The following opens a browser tab for each of the JSON files
        # Thanks Conrad Murray for this tip
        Start-Process -FilePath 'chrome.exe' -ArgumentList $StreamAPIVideos100.replace("skip=0","skip=$skipCounter")
    }
}
Write-host " --save each browser window showing JSON output into $streamJSONfolder" -ForegroundColor Green
Write-host " -----------------------------------------------------------------------------------" -ForegroundColor Green
Write-host " -----------------------------------------" -ForegroundColor Green
Read-Host -Prompt "Press Enter to continue ...."
Write-host " -----------------------------------------" -ForegroundColor Green

$JSONFiles = Get-ChildItem -Path $streamJSONfolder -Recurse -Include *.json
[int]$videoscounter = 0
$VideosjsonAggregateddata = @()
$data = @()
foreach($fileItem in $JSONFiles) {
    Write-host " -----------------------------------------" -ForegroundColor Green
    Write-Host " =====>>>> getting content of JSON File:", $fileItem, "- Path:", $fileItem.FullName -ForegroundColor Yellow
    $Videosjsondata = Get-Content -Raw -Path $fileItem.FullName | ConvertFrom-Json
    $VideosjsonAggregateddata += $Videosjsondata
    Write-host " -----------------------------------------" -ForegroundColor Green
    #Write-Host " =====>>>> Channel JSON Raw data:", $Videosjsondata -ForegroundColor green
    #Read-Host -Prompt "Press Enter to continue ...."
} write-host "You have " $VideosjsonAggregateddata.value.count " videos in Stream , using these selection criteria" foreach($myVideo in $VideosjsonAggregateddata.value) { $videoscounter += 1 $datum = New-Object -TypeName PSObject Write-host " -----------------------------------------" -ForegroundColor Green Write-Host " =====>>>> Video (N°", $videoscounter ,") ID:", $myVideo.id -ForegroundColor green Write-Host " =====>>>> Video Name:", $myVideo.name," created:", $myVideo.created,"- modified:", $myVideo.modified -ForegroundColor green Write-Host " =====>>>> Video Metrics views:", $myVideo.metrics.views, "- comments:", $myVideo.metrics.comments -ForegroundColor Magenta Write-Host " =====>>>> Video Creator Name: ", $myVideo.creator.name , " - Email:", $myVideo.creator.mail -ForegroundColor Magenta Write-Host " =====>>>> Video Description: ", $myVideo.description -ForegroundColor Magenta $datum | Add-Member -MemberType NoteProperty -Name VideoID -Value $myVideo.id $datum | Add-Member -MemberType NoteProperty -Name VideoName -Value $myVideo.name $datum | Add-Member -MemberType NoteProperty -Name VideoURL -Value $($StreamPortalVideoViewRoot + $myVideo.id) $datum | Add-Member -MemberType NoteProperty -Name VideoCreatorName -Value $myVideo.creator.name $datum | Add-Member -MemberType NoteProperty -Name VideoCreatorEmail -Value $myVideo.creator.mail $datum | Add-Member -MemberType NoteProperty -Name VideoCreationDate -Value $myVideo.created $datum | Add-Member -MemberType NoteProperty -Name VideoModificationDate -Value $myVideo.modified $datum | Add-Member -MemberType NoteProperty -Name VideoLikes -Value $myVideo.metrics.likes $datum | Add-Member -MemberType NoteProperty -Name VideoViews -Value $myVideo.metrics.views $datum | Add-Member -MemberType NoteProperty -Name VideoComments -Value $myVideo.metrics.comments #the userData value is for the user running the JSON query i.e. did that user view this video. It isn't for information about all users who may have seen this video. There seems to be no information about that other than, total views = metrics.views #$datum | Add-Member -MemberType NoteProperty -Name VideoComments -Value $myVideo.userData.isViewed $datum | Add-Member -MemberType NoteProperty -Name Videodescription -Value $myVideo.description #thanks Johnathan Ogden for these values $datum | Add-Member -MemberType NoteProperty -Name VideoDuration -Value $myVideo.media.duration $datum | Add-Member -MemberType NoteProperty -Name VideoHeight -Value $myVideo.media.height $datum | Add-Member -MemberType NoteProperty -Name VideoWidth -Value $myVideo.media.width $datum | Add-Member -MemberType NoteProperty -Name VideoIsAudioOnly -Value $myVideo.media.isAudioOnly $datum | Add-Member -MemberType NoteProperty -Name VideoContentType -Value $myVideo.contentType $data += $datum } $datestring = (get-date).ToString("yyyyMMdd-hhmm") $csvfileName = ($PowerShellScriptFolder + "\O365StreamVideoDetails_" + $datestring + ".csv") #<<<< Update this value if you want a different file name Write-host " -----------------------------------------" -ForegroundColor Green Write-Host (" >>> writing to file {0}" -f $csvfileName) -ForegroundColor Green $data | Export-csv $csvfileName -NoTypeInformation Write-host " ------------------ DONE -----------------------" -ForegroundColor Green Disclaimer : You can use that solution as you want and modify it depending of your case. 
Many thanks to Fromelard and his https://techcommunity.microsoft.com/t5/microsoft-stream-forum/powershell-script-to-audit-and-export-channel-content-details-of/m-p/354832 which gave me enough to figure out how to do this.

PowerShell script to audit and export Channel content details of your Office 365 Stream
Following the previous script developed to audit the Office 365 Video portal:
https://techcommunity.microsoft.com/t5/Office-365-Video/PowerShell-script-to-audit-and-export-all-content-details-of/m-p/352594#M830
I worked to retrieve something equivalent for Office 365 Stream.

There is no public API available for Office 365 Stream yet; Microsoft has communicated API delivery for Q2 or Q3 2019. But the O365 Stream portal (https://web.microsoftstream.com) uses internal APIs in its pages, as in the following examples:

List of O365 Groups:
  Page: https://web.microsoftstream.com/browse?view=group
  API: https://euno-1.api.microsoftstream.com/api/groups?xxx
List of videos for one O365 Group:
  Page: https://web.microsoftstream.com/group/{groupid}?view=videos
  API: https://euno-1.api.microsoftstream.com/api/groups/{groupid}/videos?xxx
List of Channels:
  Page: https://web.microsoftstream.com/browse?view=channel
  API: https://euno-1.api.microsoftstream.com/api/channels?xxx

Those APIs, which return results in JSON format, can be used only if you sign in to O365 Stream first to start your web session; because this solution is only a stopgap while waiting for the official public API, I decided to stay in that web-browser usage mode.

The process to follow, step by step (the steps are also noted in the PowerShell script):

1. Open the O365 Stream portal in your web browser (with your admin credentials): https://web.microsoftstream.com/?NoSignUpCheck=1
2. With the same web browser (another tab or window), open the following links:
   FROM 0 to 100: https://euno-1.api.microsoftstream.com/api/channels?$top=100&$orderby=metrics%2Fvideos%20desc&$expand=creator,group&api-version=1.3-private&$skip=0
   FROM 100 to 200: https://euno-1.api.microsoftstream.com/api/channels?$top=100&$orderby=metrics%2Fvideos%20desc&$expand=creator,group&api-version=1.3-private&$skip=100
   ...: stop when you have no values in the page
3. Save each of the JSON results into the work folder (C:\PSScript\ChannelsJSON\):
   channels100.json
   channels200.json
   ...
4. The PowerShell script will generate an aggregated CSV file containing the O365 Stream Channel details.

Before you execute the script, you have to set the parameters to the correct values (depending on the total number of channels in your O365 Stream and where you place the PowerShell script):

[int]$Loopnumber = 2 # << 2 indicates between 100 and 200 channels
[string]$PowerShellScriptFolder = "C:\PSScript"
[string]$streamJSONfolder = Join-Path -Path $PowerShellScriptFolder -ChildPath "ChannelsJSON"

Remove-Item -path $streamJSONfolder\* -include *.json -Force -Recurse

[string]$StreamPortal = "https://web.microsoftstream.com/?NoSignUpCheck=1"
[string]$StreamPortalChannelRoot = "https://web.microsoftstream.com/channel/"
[string]$StreamAPIChannels100 = "https://euno-1.api.microsoftstream.com/api/channels?NoSignUpCheck=1&`$top=100&`$orderby=metrics%2Fvideos%20desc&`$expand=creator,group&api-version=1.3-private&`$skip="

Write-host " -----------------------------------------" -ForegroundColor Green
Write-Host " =====>>>> PortalURL:", $StreamPortal
Start-Process -FilePath 'iexplore.exe' -ArgumentList $StreamPortal
Write-Host " Enter your credentials to load O365 Stream portal" -ForegroundColor Magenta
Read-Host -Prompt "Press Enter to continue ...."
for($i=0; $i -lt $Loopnumber; $i++) {
    Write-host " -----------------------------------------" -ForegroundColor Green
    # Build this page's URL in a separate variable so the base URL isn't modified each loop
    $PageUrl = $StreamAPIChannels100 + $($i*100)
    Write-Host " =====>>>> Next 100 channels (from", $($i*100), "to", $(($i+1)*100), "):", $PageUrl
    Start-Process -FilePath 'iexplore.exe' -ArgumentList $PageUrl
    Write-Host " Save the 100 channels (from", $($i*100), "to", $(($i+1)*100), ") into the folder $streamJSONfolder respecting the name channels$(($i+1)*100).json" -ForegroundColor Magenta
    Read-Host -Prompt "Press Enter to continue ...."
}

Write-host " -----------------------------------------" -ForegroundColor Green
$ChannelJSONFiles = Get-ChildItem -Path $streamJSONfolder -Recurse -Include *.json
[int]$channelscounter = 0
$ChanneljsonAggregateddata = @()
$data = @()
foreach($channelsjson in $ChannelJSONFiles) {
    Write-host " -----------------------------------------" -ForegroundColor Green
    Write-Host " =====>>>> JSON File:", $channelsjson, "- Path:", $channelsjson.FullName -ForegroundColor Yellow
    $Channeljsondata = Get-Content -Raw -Path $channelsjson.FullName | ConvertFrom-Json
    $ChanneljsonAggregateddata += $Channeljsondata
    Write-host " -----------------------------------------" -ForegroundColor Green
    #Write-Host " =====>>>> Channel JSON Raw data:", $Channeljsondata -ForegroundColor green
    #Read-Host -Prompt "Press Enter to continue ...."
}

foreach($myChannel in $ChanneljsonAggregateddata.value) {
    if($myChannel.metrics.videos -gt -1) {
        $channelscounter += 1
        $datum = New-Object -TypeName PSObject
        Write-host " -----------------------------------------" -ForegroundColor Green
        Write-Host " =====>>>> Channel (N°", $channelscounter ,") ID:", $myChannel.id, "- isDefault Channel:", $myChannel.isDefault -ForegroundColor green
        Write-Host " ---- Channel Name:", $myChannel.name,"- Channel Portal URL:", $($StreamPortalChannelRoot + $myChannel.id)
        Write-Host " ---- Channel CreationDate:", $myChannel.created,"- Channel ModificationDate:", $myChannel.modified
        Write-Host " =====>>>> Channel Metrics Followers:", $myChannel.metrics.follows, "- Video Total:", $myChannel.metrics.videos -ForegroundColor Magenta
        Write-Host " =====>>>> O365 Channel Creator Name: ", $myChannel.creator.name , " - Email:", $myChannel.creator.mail -ForegroundColor Magenta
        Write-Host " O365 GROUP Name:", $myChannel.group.name, "- ID:", $myChannel.group.id -ForegroundColor Yellow
        Write-Host " =====>>>> O365 Group ID: ", $myChannel.group.id , " - Group Email:", $myChannel.group.aadGroup.mail -ForegroundColor Magenta
        Write-Host " =====>>>> O365 Group Metrics Channel total:", $myChannel.group.metrics.channels, "- Video Total:", $myChannel.group.metrics.videos -ForegroundColor Magenta
        $datum | Add-Member -MemberType NoteProperty -Name ChannelID -Value $myChannel.id
        $datum | Add-Member -MemberType NoteProperty -Name ChannelName -Value $myChannel.name
        $datum | Add-Member -MemberType NoteProperty -Name ChannelURL -Value $($StreamPortalChannelRoot + $myChannel.id)
        $datum | Add-Member -MemberType NoteProperty -Name ChannelDefault -Value $myChannel.isDefault
        $datum | Add-Member -MemberType NoteProperty -Name ChannelFollowers -Value $myChannel.metrics.follows
        $datum | Add-Member -MemberType NoteProperty -Name ChannelVideos -Value $myChannel.metrics.videos
        $datum | Add-Member -MemberType NoteProperty -Name ChannelCreatorName -Value $myChannel.creator.name
        $datum | Add-Member -MemberType NoteProperty -Name ChannelCreatorEmail -Value $myChannel.creator.mail
        $datum | Add-Member -MemberType NoteProperty -Name ChannelCreationDate -Value $myChannel.created
        $datum | Add-Member -MemberType NoteProperty -Name ChannelModificationDate -Value $myChannel.modified
        $datum | Add-Member -MemberType NoteProperty -Name O365GroupId -Value $myChannel.group.id
        $datum | Add-Member -MemberType NoteProperty -Name O365GroupName -Value $myChannel.group.name
        $datum | Add-Member -MemberType NoteProperty -Name O365GroupEmail -Value $myChannel.group.aadGroup.mail
        $datum | Add-Member -MemberType NoteProperty -Name O365GroupTotalChannels -Value $myChannel.group.metrics.channels
        $datum | Add-Member -MemberType NoteProperty -Name O365GroupTotalVideos -Value $myChannel.group.metrics.videos
        $data += $datum
    }
}

$datestring = (get-date).ToString("yyyyMMdd-hhmm")
$fileName = ($PowerShellScriptFolder + "\O365StreamDetails_" + $datestring + ".csv")
Write-host " -----------------------------------------" -ForegroundColor Green
Write-Host (" >>> writing to file {0}" -f $fileName) -ForegroundColor Green
$data | Export-csv $fileName -NoTypeInformation
Write-host " -----------------------------------------" -ForegroundColor Green

You can use that solution as you want and modify it depending on your case and needs.

Fabrice Romelard

French version: http://blogs.developpeur.org/fabrice69/archive/2019/02/21/office-365-script-powershell-pour-auditer-le-contenu-de-son-office-365-stream-portal.aspx

Update March 22 2019: The Stream root URL used for the internal API can differ based on the tenant region. I collected as many of these URLs as I could (if you have a new one, please add it in a comment to update the list):
Europe: https://euno-1.api.microsoftstream.com
Asia: https://aase-1.api.microsoftstream.com
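If you would rather not save JSON pages by hand, the same paging can be done with Invoke-RestMethod once you have a bearer token for your session (the Oct 2022 script at the top of this thread shows one way to capture one). A rough sketch - the region host and the $AccessToken variable are assumptions you must supply yourself:

# Sketch: page through the Stream channels API 100 items at a time.
# Assumes $AccessToken holds a bearer token for your tenant, and that your
# region host is euno-1 (swap it to match your tenant's region).
$Headers = @{ Authorization = "Bearer $AccessToken" }
$BaseUrl = "https://euno-1.api.microsoftstream.com/api/channels?`$top=100&api-version=1.3-private&`$skip="
$AllChannels = @()
$Skip = 0
do {
    $Page = @((Invoke-RestMethod -Uri ($BaseUrl + $Skip) -Headers $Headers -Method Get).value)
    $AllChannels += $Page
    $Skip += 100
} until ($Page.Count -lt 100)
Write-Host "Fetched $($AllChannels.Count) channels"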
Increased security visibility through new Standard Logs in Microsoft Purview Audit

In response to increasing frequency and evolution of cyberthreats, Microsoft is providing access to wider cloud security logs to its worldwide customers at no additional cost. Audit (Standard) customers can now access these additional logs, which have been identified as a result of close coordination with commercial and government customers, and with the Cybersecurity and Infrastructure Security Agency (CISA).
Introducing the Microsoft Purview Audit Search Graph API

The new Microsoft Purview Audit Search Graph API will enable the programmatic search and retrieval of relevant audit logs with improvements in search completeness, reliability, and performance. This API serves as an improved alternative to the existing PowerShell cmdlet, Search-UnifiedAuditLog.
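The announcement above doesn't include sample code; as a rough sketch of the asynchronous pattern this API uses (the endpoint, property, and permission names follow the public Microsoft Graph documentation as I understand it - treat them as assumptions to verify, and fall back to the /beta endpoint if your tenant doesn't have v1.0 yet):

# Sketch: create an audit log search via Microsoft Graph, poll until it
# completes, then read the records. Assumes $Token holds an access token
# granted the AuditLogsQuery.Read.All permission.
$Headers = @{ Authorization = "Bearer $Token"; 'Content-Type' = 'application/json' }
$Body = @{
    displayName         = 'Exchange admin activity - last 7 days'
    filterStartDateTime = (Get-Date).AddDays(-7).ToString('o')
    filterEndDateTime   = (Get-Date).ToString('o')
    recordTypeFilters   = @('exchangeAdmin')
} | ConvertTo-Json
$Query = Invoke-RestMethod -Method Post -Headers $Headers -Body $Body `
    -Uri 'https://graph.microsoft.com/v1.0/security/auditLog/queries'
do {
    Start-Sleep -Seconds 30   # the search runs server-side and can take a while
    $Query = Invoke-RestMethod -Method Get -Headers $Headers `
        -Uri "https://graph.microsoft.com/v1.0/security/auditLog/queries/$($Query.id)"
} while ($Query.status -in 'notStarted','running')
$Records = Invoke-RestMethod -Method Get -Headers $Headers `
    -Uri "https://graph.microsoft.com/v1.0/security/auditLog/queries/$($Query.id)/records"
$Records.value | Select-Object -First 5

Unlike Search-UnifiedAuditLog, the search is asynchronous: you create a query, wait for it to complete, then page through the records (follow @odata.nextLink in the records response for additional pages).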
PowerShell script to export SharePoint Usage in CSV format used to Audit an Office 365 Tenant

As I published the Exchange audit script last time (PowerShell script to export Exchange Usage in CSV format used to Audit an Office 365 Tenant), the question is similar for the SharePoint part of an Office 365 tenant. So I wrote this script to export, in CSV format, a picture of SharePoint usage in an Office 365 tenant. It loops through each site collection and, depending on permissions, its subsites, to extract size, volume, and customization details (for the old SP versions).

[boolean]$DebugGlobalMode = $True #$False

[string]$username = "Admin@yourtenant.onmicrosoft.com"
[string]$PwdTXTPath = "C:\SECUREDPWD\ExportedPWD-$($username).txt"
[string]$CSVFolderReport = "C:\SHAREPOINT\Reports\"
[string]$AdminTenantURL = "https://YourTenant-admin.sharepoint.com"

function Load-DLLandAssemblies {
    [string]$defaultDLLPath = ""
    # Load assemblies to PowerShell session
    $defaultDLLPath = "C:\Program Files\SharePoint Online Management Shell\Microsoft.Online.SharePoint.PowerShell\Microsoft.SharePoint.Client.dll"
    [System.Reflection.Assembly]::LoadFile($defaultDLLPath)
    $defaultDLLPath = "C:\Program Files\SharePoint Online Management Shell\Microsoft.Online.SharePoint.PowerShell\Microsoft.SharePoint.Client.Runtime.dll"
    [System.Reflection.Assembly]::LoadFile($defaultDLLPath)
    $defaultDLLPath = "C:\Program Files\SharePoint Online Management Shell\Microsoft.Online.SharePoint.PowerShell\Microsoft.Online.SharePoint.Client.Tenant.dll"
    [System.Reflection.Assembly]::LoadFile($defaultDLLPath)
}

function Get-SPOWebs(){
    param(
        $Url = $(throw "Please provide a Site Collection Url"),
        $Credential = $(throw "Please provide a Credentials")
    )
    $context = New-Object Microsoft.SharePoint.Client.ClientContext($Url)
    $context.Credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($Credential.UserName,$Credential.Password)
    $context.RequestTimeout = 1000000 # milliseconds
    $web = $context.Web
    $context.Load($web)
    $context.Load($web.Webs)
    $context.ExecuteQuery()
    foreach($myweb in $web.Webs) {
        Get-SPOWebs -Url $myweb.Url -Credential $Credential
        $myweb
    }
}

function Check-InfoPath-Usage($myspoWebSite, $myspcontext) {
    [string]$InfotpathStatus = ""
    [boolean]$DebugMode = $false #$DebugGlobalMode
    $AllspwebLists = $myspoWebSite.lists
    $myspcontext.Load($AllspwebLists)
    $myspcontext.ExecuteQuery()
    Write-Host " ---------------------------------------------- "
    if($DebugMode) {Write-Host " -->InfoPath:",$($myspoWebSite.Id), "-",$($myspoWebSite.Url), "-",$($myspoWebSite.Title) -ForegroundColor Yellow}
    foreach($myList in $AllspwebLists) {
        $myspcontext.Load($myList)
        $myspcontext.ExecuteQuery()
        $listTitle = $myList.Title
        $listType = $myList.BaseTemplate
        $listUrl = $myList.DefaultViewUrl
        try {
            if($DebugMode) {Write-Host " -->Infopath: List Check", $listTitle, "(", $listType, ") at WebURL", $myspoWebSite.url -ForegroundColor Green}
            if($listType -eq 100 -or $listType -eq 101) {
                if($DebugMode) {Write-Host " -->Infopath: Line 70 - listType:", $listType}
                $isSysList = $myList.IsSystemList
                $IswebCatalog = $myList.IsCatalog
                $IsAppList = $myList.IsApplicationList
                $listForms = $myList.Forms
                $myspcontext.Load($listForms)
                $myspcontext.ExecuteQuery()
                if($DebugMode) {Write-Host " -->Infopath: Line 77 - isSysList:", $isSysList}
                if($DebugMode) {Write-Host " -->Infopath: Line 78 - IsCatalog:", $IswebCatalog}
                if($DebugMode) {Write-Host " -->Infopath: Line 79 - IsApplicationList:", $IsAppList}
                if($isSysList -or $IswebCatalog -or $IsAppList) {
                    if($DebugMode) {Write-Host " -->Infopath: System, Application or Catalog List Ignore", $listTitle, "at URL", $myspoWebSite.url -ForegroundColor Yellow}
                }
                else {
                    if($listType -eq 101) {
                        if($DebugMode) {Write-Host " -->Infopath: Line 88 - listType:", $listType}
                        if($myList.AllowContentTypes) {
                            if($DebugMode) {Write-Host " -->Infopath: Line 89 - AllowContentTypes:", $myList.AllowContentTypes}
                            $contentTyps = $myList.ContentTypes
                            $myspcontext.Load($contentTyps)
                            $myspcontext.ExecuteQuery()
                            forEach($contType in $contentTyps) {
                                if($DebugMode) {Write-Host " -->Infopath: Line 97 - contType.Name:", $contType.Name}
                                if($contType.Name -eq "Form") {
                                    Write-Host " -->InfoPath: Found in Library", $listTitle, "at URL", $myspoWebSite.url -ForegroundColor Magenta
                                    $InfotpathStatus += "Infopath:"+ $myspoWebSite.url +";"
                                }
                            }
                        }
                        if($DebugMode) {Write-Host " -->Infopath: Line 105 - listType:", $listType}
                    }
                    else {
                        forEach($listFm in $listForms) {
                            $listPath = $listFm.ServerRelativeUrl
                            if($DebugMode) {Write-Host " -->Infopath: Line 112 - listPath:", $listPath}
                            if ($listPath -like '*displayifs.aspx') {
                                Write-Host " -->InfoPath: Found in List", $listTitle, "at URL", $myspoWebSite.url -ForegroundColor Magenta
                                $InfotpathStatus += "Infopath:"+ $myspoWebSite.url +";"
                            }
                        }
                    }
                }
            }
        }
        catch {
            Write-Host " -->Infopath: Error Check for list:", $listTitle -ForegroundColor Red
            Write-Host "    ErrorMessage:", $_.Exception -ForegroundColor Red
        }
    }
    return $InfotpathStatus
}

function Check-SPWorkflow($myspoWebSite, $myspcontext) {
    [string]$WorkflowStatus = ""
    [boolean]$DebugMode = $false #$DebugGlobalMode
    $AllspwebLists = $myspoWebSite.lists
    $myspcontext.Load($AllspwebLists)
    $myspcontext.ExecuteQuery()
    if($DebugMode) {Write-Host " --> WorkFlow: ",$($myspoWebSite.Id), "-",$($myspoWebSite.Url), "-",$($myspoWebSite.Title) -ForegroundColor Yellow}
    foreach($list in $AllspwebLists) {
        if($DebugMode) {Write-Host " -->SPWorkflow: List Check", $list.Title, " at WebURL", $myspoWebSite.url -ForegroundColor Green}
        try {
            $myspcontext.Load($list.WorkflowAssociations)
            $myspcontext.ExecuteQuery()
            foreach($wfAssociation in $list.WorkflowAssociations) {
                if($DebugMode) {Write-Host " -->SPWorkflow: List ", $list.Title, "- Workflow:", $wfAssociation.Name -ForegroundColor Magenta}
                $WorkflowStatus += "`"$($list.Title)`",`"$($wfAssociation.Name)`",`"$($wfAssociation.TaskListTitle)`","
                #$WorkflowStatus += "`"$($wfAssociation.HistoryListTitle)`",$($wfAssociation.Created),$($wfAssociation.Modified)"
            }
        }
        catch {
            Write-Host " -->WorkFlowCheck: Error Check for list:", $list.Title -ForegroundColor Red
            Write-Host "    ErrorMessage:", $_.Exception -ForegroundColor Red
        }
    }
    return $WorkflowStatus
}

cls
Write-Host " ---------------------------------------------- "
Load-DLLandAssemblies
Write-Host " ---------------------------------------------- "

$secureStringPwd = ConvertTo-SecureString -string (Get-Content $PwdTXTPath)
$adminCreds = New-Object System.Management.Automation.PSCredential $username, $secureStringPwd
#$adminCreds = get-credential

Connect-SPOService -Url $AdminTenantURL -credential $adminCreds -ErrorAction SilentlyContinue -ErrorVariable Err

$data = @()
# Retrieve all site collection infos
#$sitesInfo = Get-SPOSite -Limit 10 | Sort-Object -Property url | Select *
#$sitesInfo = Get-SPOSite -Template "STS#0" -Limit 10 | Sort-Object -Property url | Select *
$sitesInfo = Get-SPOSite -Limit ALL | Sort-Object -Property url | Select *

[int]$i = 1;
[string]$CheckInfoPathStatus = ""
[string]$CheckWorkFlowStatus = ""
$data = @()
Write-Host "--------------------------------------------------------------------------------------------"
# Retrieve and print all sites
foreach ($site in $sitesInfo) {
    Write-Host "SiteColl Number:", $i, "- of:", $sitesInfo.Count;
"SiteColl Number:", $i, "- of:", $sitesInfo.Count; $i += 1; $RootSiteCreatedDate = get-date "1900-01-01" try { $Rootcontext = New-Object Microsoft.SharePoint.Client.ClientContext($site.Url) $Rootcontext.Credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($adminCreds.UserName,$adminCreds.Password) $Rootcontext.RequestTimeout = 1000000 # milliseconds $RootWeb = $Rootcontext.web $Rootcontext.Load($RootWeb) $Rootcontext.ExecuteQuery() $RootSiteCreatedDate = $RootWeb.Created $CheckInfoPathStatus = Check-InfoPath-Usage $RootWeb $Rootcontext $CheckWorkFlowStatus = Check-SPWorkflow $RootWeb $Rootcontext } catch { Write-host " =====>>>> Impossible to get the RootSite " -ForegroundColor Red Write-host " =====>>>> RootSite:", $site.Url -ForegroundColor Yellow } Write-Host "SPO Site collection:", $site.Url, "- Title:", $site.Title Write-Host " => Creation Date:", $RootSiteCreatedDate, "- LastItemModifiedDate", $site.LastContentModifiedDate Write-Host " => External Sharing:", $site.SharingCapability Write-Host " => Site Template Used:", $site.Template Write-Host " => Storage Quota:", $site.StorageQuota Write-Host " => Storage used:", $site.StorageUsageCurrent Write-Host " => Storage Warning Level:", $site.StorageQuotaWarningLevel Write-Host " => Resource Quota:", $site.ResourceQuota, "- Resource used:", $site.ResourceUsageCurrent $SuborRootSite = "RootSite" $data += @( [pscustomobject]@{ SiteCollectionURL = $site.Url SiteCollectionTitle = $site.Title SPType = $site.Template SubsiteURL = $site.Url SuborRootSite = $SuborRootSite WebTemplate = $site.Template WebCreationDate = $RootSiteCreatedDate LastItemModifiedDate = $site.LastContentModifiedDate ExternalSharingCapability = $site.SharingCapability StorageQuotaMB = $site.StorageQuota StorageUsageCurrentMB = $site.StorageUsageCurrent StorageQuotaWarningLevelMB = $site.StorageQuotaWarningLevel ResourceQuota = $site.ResourceQuota ResourceUsageCurrent = $site.ResourceUsageCurrent DevCustomCreated = "" DevCustomSPWorkflow = $CheckWorkFlowStatus DevSPFxCreated = "" DevMSFloworPowerAppsCreated = "" DevInforpathForm = $CheckInfoPathStatus } ) try { $AllWebs = Get-SPOWebs -Url $site.Url -Credential $adminCreds if($DebugMode) {$AllWebs | %{ Write-Host " >>", $_.Title, "-", $_.Url}} Write-Host "--------------------------------------------------------------------------------------------" foreach($mySPWeb in $AllWebs) { Write-Host " >> Subsite:", $mySPWeb.Url -ForegroundColor magenta $CheckInfoPathStatus = Check-InfoPath-Usage $RootWeb $Rootcontext $CheckWorkFlowStatus = Check-SPWorkflow $RootWeb $Rootcontext $SuborRootSite = "SubSite" $data += @( [pscustomobject]@{ SiteCollectionURL = $site.Url SiteCollectionTitle = $site.Title SPType = $site.Template SubsiteURL = $mySPWeb.Url SuborRootSite = $SuborRootSite WebTemplate = $mySPWeb.WebTemplate WebCreationDate = $mySPWeb.Created LastItemModifiedDate = $mySPWeb.LastItemModifiedDate ExternalSharingCapability = $site.SharingCapability StorageQuotaMB = $site.StorageQuota StorageUsageCurrentMB = $site.StorageUsageCurrent StorageQuotaWarningLevelMB = $site.StorageQuotaWarningLevel ResourceQuota = $site.ResourceQuota ResourceUsageCurrent = $site.ResourceUsageCurrent DevCustomCreated = "" DevCustomSPWorkflow = $CheckWorkFlowStatus DevSPFxCreated = "" DevMSFloworPowerAppsCreated = "" DevInforpathForm = $CheckInfoPathStatus } ) } } catch { Write-host " =====>>>> Impossible to get the Subsites " -ForegroundColor Red Write-host " =====>>>> RootSite:", $site.Url -ForegroundColor Yellow } } #Write-Host 
$datestring = (get-date).ToString("yyyyMMdd-hhmm")
$CSVFileToExport = Join-Path -Path $CSVFolderReport -ChildPath $("SharePoint_"+ $datestring + ".csv")
Write-host " -----------------------------------------" -ForegroundColor Green
Write-Host (" >>> writing to file {0}" -f $CSVFileToExport) -ForegroundColor Green
$data | Export-csv $CSVFileToExport -NoTypeInformation -enc utf8
Write-host " -----------------------------------------" -ForegroundColor Green

You can adapt that script as you need, based on your own requirements.

Fabrice Romelard

French version: http://blogs.developpeur.org/fabrice69/archive/2019/02/27/office-365-script-powershell-pour-auditer-l-usage-de-sharepoint-online-de-votre-tenant.aspx

Sources used:
http://sharepoint.stackexchange.com/questions/86842/extract-all-list-names-from-sharepoint-site-to-csv
https://collab365.community/forum/topics/find-all-sharepoint-online-sites-with-an-infopath-form-and-determine-if-still-being-used/
https://sharepoint.stackexchange.com/questions/240149/get-the-list-of-all-available-infopath-forms-in-a-sharepoint-site-collection-s
https://gallery.technet.microsoft.com/office/SharePoint-Online-Get-all-40d76baa8
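One setup note for the script above: it reads the password from a file exported with ConvertFrom-SecureString, which is DPAPI-protected, so the file must be created by the same Windows user on the same machine that will run the audit. A minimal sketch to create it:

# Sketch: create the exported password file the script reads from $PwdTXTPath.
# Run this once, as the same user and on the same machine as the audit script.
$username = "Admin@yourtenant.onmicrosoft.com"
$PwdTXTPath = "C:\SECUREDPWD\ExportedPWD-$($username).txt"
Read-Host "Enter password for $username" -AsSecureString |
    ConvertFrom-SecureString |
    Out-File $PwdTXTPath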
Enterprise-grade controls for AI apps and agents built with Azure AI Foundry and Copilot Studio

AI innovation is moving faster than ever, and more AI projects are moving beyond experimentation into deployment to drive tangible business impact. As organizations accelerate innovation with custom AI applications and agents, new risks emerge across the software development lifecycle and AI stack related to data oversharing and leaks, new vulnerabilities and threats, and non-compliance with stringent regulatory requirements.

Through 2025, poisoning of software supply chains and infrastructure technology stacks will constitute more than 70% of malicious attacks against AI used in the enterprise [1], highlighting potential threats that originate early in development. Today, the average cost of a data breach is $4.88 million, but when security issues are caught early in the development process, that number drops dramatically to just $80 per incident [2]. The message is very clear: security can't be an afterthought anymore. It must be a team sport across the organization, embedded from the start and throughout the development lifecycle. That's why developers and security teams should align on processes and tools that bring security into every stage of the AI development lifecycle and give security practitioners visibility into, and the ability to mitigate, risks.

To address these growing challenges and help customers secure and govern their AI workloads across development and security teams, we are:

- Enabling Azure AI Foundry and Microsoft Copilot Studio to provide best-in-class foundational capabilities to secure and govern AI workloads
- Deeply integrating and embedding industry-leading capabilities from Microsoft Purview, Microsoft Defender, and Microsoft Entra into Azure AI Foundry and Microsoft Copilot Studio

This week, 3,000 developers are gathering in Seattle for the annual Microsoft Build conference, with many more tuning in online, to learn practical skills for accelerating their AI apps and agents' innovation. To support their AI innovation journey, today we are excited to announce several new capabilities to help developers and organizations secure and govern AI apps and agents.

New Azure AI Foundry foundational capabilities to secure and govern AI workloads

Azure AI Foundry enhancements for AI security and safety

With 70,000 customers, 100 trillion tokens processed this quarter, and 2 billion enterprise search queries each day, Azure AI Foundry has grown beyond just an application layer - it's now a comprehensive platform for building agents that can plan, take action, and continuously learn to drive real business outcomes. To help organizations build and deploy AI with confidence, we're introducing new security and safety capabilities and insights for developers in Azure AI Foundry.

Introducing Spotlighting to detect and block prompt injection attacks in real time

As AI systems increasingly rely on external data sources, a new class of threats has emerged. Indirect prompt injection attacks embed hidden instructions in documents, emails, and web content, tricking models into taking unauthorized actions without any direct user input. These attacks are difficult to detect and hard to prevent using traditional filters alone. To address this, Azure AI Content Safety is introducing Spotlighting, now available in preview. Spotlighting strengthens the Prompt Shields guardrail by improving its ability to detect and handle potential indirect prompt injections, where hidden adversarial instructions are embedded in external content.
This new capability helps prevent the model from inadvertently acting on malicious prompts that are not directly visible to the user.

Figure: Enable Spotlighting in Azure AI Content Safety to detect potential indirect prompt injection attacks

New capabilities for task adherence evaluation and task adherence mitigation to ensure agents remain within scope

As developers build more capable agents, organizations face growing pressure to help confirm those agents act within defined instructions and policy boundaries. Even small deviations can lead to tool misuse, broken workflows, or risks like unintended exposure of sensitive data. To solve this, Azure AI Foundry now includes task adherence for agents, now in preview and powered by two components: a real-time evaluation and a new control within Azure AI Content Safety.

At the core is a real-time task adherence evaluation API, part of Azure AI Content Safety. This API assesses whether an agent's behavior is aligned with its assigned task by analyzing the user's query, system instructions, planned tool calls, and the agent's response. The evaluation framework is built on Microsoft's Agent Evaluators, which measure intent resolution, tool selection accuracy, completeness of response, and overall alignment to the original request. Developers can run this scoring logic locally using the Task Adherence Evaluator in the Azure AI Evaluation SDK, with a five-point scale that ranges from fully nonadherent to fully adherent. This gives teams a flexible and transparent way to inspect task-level behavior before it causes downstream issues.

Task adherence is enforced through a new control in Azure AI Content Safety. If an agent goes off-task, the control can block tool use, pause execution, or trigger human review. In Azure AI Agent Service, it is available as an opt-in feature and runs automatically. Combined with real-time evaluation, this control helps to ensure that agents stay on task, follow instructions, and operate according to enterprise policies. Learn more about Prompt Shields in Azure AI Content Safety.

Azure AI Foundry continuous evaluation and monitoring of agentic systems

Maintaining high performance and compliance for AI agents after deployment is a growing challenge. Without ongoing oversight, issues like performance degradation, safety risks, or unintentional misuse of resources can slip through unnoticed. To address this, Azure AI Foundry introduces continuous evaluation and monitoring of agentic systems, now in preview, which provides a single-pane-of-glass dashboard to track key metrics such as performance, quality, safety, and resource usage in real time. Continuous evaluation runs quality and safety evaluations at a sampled rate of production usage, with results made available in the Azure AI Foundry Monitoring dashboard and published to Application Insights. Developers can set alerts to detect drift or regressions and use Azure Monitor to gain full-stack visibility into their AI systems. For example, an organization using an AI agent to assist with customer-facing tasks can monitor groundedness and detect a decline in quality when the agent begins referencing irrelevant information, helping teams to act before it erodes user trust.

Azure AI Foundry evaluation integrations with Microsoft Purview Compliance Manager, Credo AI, and Saidot for streamlined compliance

AI regulations and standards introduce new requirements for transparency, documentation, and risk management for high-risk AI systems.
As developers build AI applications and agents, they may need guidance and tools to help them evaluate risks based on these requirements and seamlessly share control and evaluation insights with compliance and risk teams. Today, we are announcing previews for the Azure AI Foundry evaluation tool's integration with a compliance management solution, Microsoft Purview Compliance Manager, and AI governance solutions, Credo AI and Saidot. These integrations help define risk parameters, run suggested compliance evaluations, and collect evidence for control testing and auditing.

For example, a developer building an AI agent in Europe may be required by their compliance team to complete a Data Protection Impact Assessment (DPIA) and an Algorithmic Impact Assessment (AIA) to meet internal risk management and technical documentation requirements aligned with emerging AI governance standards and best practices. Based on Purview Compliance Manager's step-by-step guidance on controls implementation and testing, the compliance team can evaluate risks such as potential bias, cybersecurity vulnerabilities, or lack of transparency in model behavior. Once the evaluation is conducted in Azure AI Foundry, the developer can obtain a report with documented risk, mitigation, and residual risk for compliance teams to upload to Compliance Manager to support audits and provide evidence to regulators or external stakeholders.

Figure: Assess controls for Azure AI Foundry against emerging AI governance standards

Learn more about Purview Compliance Manager. Learn more about the integration with Credo AI and Saidot in this blogpost.

Leading Microsoft Entra, Defender and Purview value extended to Azure AI Foundry and Microsoft Copilot Studio

Introducing Microsoft Entra Agent ID to help address agent sprawl and manage agent identity

Organizations are rapidly building their own AI agents, leading to agent sprawl and a lack of centralized visibility and management. Security teams often struggle to keep up, unable to see which agents exist and whether they introduce security or compliance risks. Without proper oversight, agent sprawl increases the attack surface and makes it harder to manage these non-human identities. To address this challenge, we're announcing the public preview of Microsoft Entra Agent ID, a new capability in the Microsoft Entra admin center that gives security admins visibility and control over AI agents built with Copilot Studio and Azure AI Foundry.

With Microsoft Entra Agent ID, an agent created through Copilot Studio or Azure AI Foundry is automatically assigned an identity, with no additional work required from the developers building it. This is the first step in a broader initiative to manage and protect non-human identities as organizations continue to build AI agents.

Figure: Security and identity admins can gain visibility into AI agents built in Copilot Studio and Azure AI Foundry in the Microsoft Entra Admin Center

This new capability lays the foundation for more advanced capabilities coming soon to Microsoft Entra. We also know that no one can do it alone. Security has always been a team sport, and that's especially true as we enter this new era of protecting AI agents and their identities. We're energized by the momentum across the industry; two weeks ago, we announced support for the Agent-to-Agent (A2A) protocol and began collaborating with partners to shape the future of AI identity workflows. Today, we're also excited to announce new partnerships with ServiceNow and Workday.
Azure AI Foundry evaluation integrations with Microsoft Purview Compliance Manager, Credo AI, and Saidot for streamlined compliance

AI regulations and standards introduce new requirements for transparency, documentation, and risk management for high-risk AI systems. As developers build AI applications and agents, they may need guidance and tools to help them evaluate risks against these requirements and seamlessly share control and evaluation insights with compliance and risk teams. Today, we are announcing previews for the Azure AI Foundry evaluation tool’s integration with a compliance management solution, Microsoft Purview Compliance Manager, and with the AI governance solutions Credo AI and Saidot. These integrations help define risk parameters, run suggested compliance evaluations, and collect evidence for control testing and auditing.

For example, a developer building an AI agent in Europe may be required by their compliance team to complete a Data Protection Impact Assessment (DPIA) and an Algorithmic Impact Assessment (AIA) to meet internal risk management and technical documentation requirements aligned with emerging AI governance standards and best practices. Based on Purview Compliance Manager’s step-by-step guidance on control implementation and testing, compliance teams can evaluate risks such as potential bias, cybersecurity vulnerabilities, or lack of transparency in model behavior. Once the evaluation is conducted in Azure AI Foundry, the developer can obtain a report documenting risks, mitigations, and residual risk for compliance teams to upload to Compliance Manager to support audits and provide evidence to regulators or external stakeholders.

Figure: Assess controls for Azure AI Foundry against emerging AI governance standards

Learn more about Purview Compliance Manager. Learn more about the integration with Credo AI and Saidot in this blog post.

Leading Microsoft Entra, Defender, and Purview value extended to Azure AI Foundry and Microsoft Copilot Studio

Introducing Microsoft Entra Agent ID to help address agent sprawl and manage agent identity

Organizations are rapidly building their own AI agents, leading to agent sprawl and a lack of centralized visibility and management. Security teams often struggle to keep up, unable to see which agents exist and whether they introduce security or compliance risks. Without proper oversight, agent sprawl increases the attack surface and makes these non-human identities harder to manage.

To address this challenge, we’re announcing the public preview of Microsoft Entra Agent ID, a new capability in the Microsoft Entra admin center that gives security admins visibility and control over AI agents built with Copilot Studio and Azure AI Foundry. With Microsoft Entra Agent ID, an agent created through Copilot Studio or Azure AI Foundry is automatically assigned an identity, with no additional work required from the developers building it. This is the first step in a broader initiative to manage and protect non-human identities as organizations continue to build AI agents.

Figure: Security and identity admins can gain visibility into AI agents built in Copilot Studio and Azure AI Foundry in the Microsoft Entra admin center

This new capability lays the foundation for more advanced identity management for agents coming soon to Microsoft Entra. We also know that no one can do it alone. Security has always been a team sport, and that’s especially true as we enter this new era of protecting AI agents and their identities. We’re energized by the momentum across the industry; two weeks ago, we announced support for the Agent2Agent (A2A) protocol and began collaborating with partners to shape the future of AI identity workflows. Today, we’re also excited to announce new partnerships with ServiceNow and Workday. As part of this, we’ll integrate Microsoft Entra Agent ID with the ServiceNow AI Platform and the Workday Agent System of Record, allowing automated provisioning of identities for future digital employees.

Learn more about Microsoft Entra Agent ID.

Microsoft Defender security alerts and recommendations now available in Azure AI Foundry

As more AI applications are deployed to production, organizations need to predict and prevent potential AI threats with natively integrated security controls backed by industry-leading generative AI and threat intelligence for AI deployments. Developers need critical signals from security teams to effectively mitigate security risks related to their AI deployments. When these critical signals live in separate systems outside the developer experience, mitigation is delayed, leaving opportunities for AI apps and agents to become liabilities and exposing organizations to threats and compliance violations.

Now in preview, Microsoft Defender for Cloud integrates AI security posture management recommendations and runtime threat protection alerts directly into the Azure AI Foundry portal. These capabilities, previously announced as part of the broader Microsoft Defender for Cloud solution, are extended natively into Azure AI Foundry, enabling developers to access alerts and recommendations without leaving their workflows. This provides real-time visibility into security risks, misconfigurations, and active threats targeting their AI applications on specific Azure AI projects, without needing to switch tools or wait on security teams for details. Security insights from Microsoft Defender for Cloud help developers identify and respond to threats like jailbreak attacks, sensitive data leakage, and misuse of system resources. These insights include:

- AI security posture recommendations that identify misconfigurations and vulnerabilities in AI services and provide best practices to reduce risk
- Threat protection alerts for AI services that notify developers of active threats and provide guidance for mitigation, across more than 15 detection types

For example, a developer building an AI-powered agent can receive a security recommendation suggesting the use of Azure Private Link for Azure AI Services resources. This reduces the risk of data leakage by handling connectivity between consumers and services over the Azure backbone network. Each recommendation includes actionable remediation steps, helping teams identify and mitigate risks in both pre- and post-deployment phases without slowing down innovation.

Figure: Developers can view security alerts on the Risks + alerts page in Azure AI Foundry

Figure: Developers can view recommendations on the Guardrails + controls page in Azure AI Foundry

This integration is currently in preview and will be generally available in June 2025 in Azure AI Foundry. Learn more about protecting AI services with Microsoft Defender for Cloud.
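Developers who prefer to pull these signals into their own tooling can also read Defender for Cloud alerts programmatically. The sketch below uses the azure-mgmt-security package and assumes the calling identity has reader access to the subscription; the subscription ID is a placeholder, and the keyword match on alert names is purely illustrative, not an official AI-alert category.

```python
# Sketch: list Defender for Cloud alerts for a subscription
# (pip install azure-mgmt-security azure-identity).
from azure.identity import DefaultAzureCredential
from azure.mgmt.security import SecurityCenter

SUBSCRIPTION_ID = "<subscription-id>"  # placeholder

client = SecurityCenter(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Print alerts whose display name mentions AI; the keyword filter is an
# illustrative assumption, so adjust it to the alert types you care about.
for alert in client.alerts.list():
    name = alert.alert_display_name or ""
    if "ai" in name.lower():
        print(name, "|", alert.severity, "|", alert.status)
```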
Microsoft Purview capabilities extended to secure and govern data in custom-built AI apps and agents

Data oversharing and leakage are among the top concerns for AI adoption, and central to many regulatory requirements. For organizations to confidently deploy AI applications and agents, both low-code and pro-code developers need a seamless way to embed security and compliance controls into their AI creations. Without simple, developer-friendly solutions, security gaps can quickly become blockers, delaying deployment and increasing risk as applications move from development to production.

Today, Purview is extending its enterprise-grade data security and compliance capabilities, making it easier for both low-code and pro-code developers to integrate data security and compliance into their AI applications and agents, regardless of which tools or platforms they use. For example, with this update, Microsoft Purview DSPM for AI becomes the one place data security teams can see data risk insights across Microsoft Copilots, agents built in Agent Builder and Copilot Studio, and custom AI apps and agents built in Azure AI Foundry and other platforms. Admins can easily drill into security and compliance insights for specific AI apps or agents, making it easier to investigate and act on potential risks.

Figure: Data security admins can now find data security and compliance insights across Microsoft Copilots, agents built with Agent Builder and Copilot Studio, and custom AI apps and agents in Microsoft Purview DSPM for AI

The following sections provide more detail about the updates to Purview capabilities across AI workloads.

1. Microsoft Purview data security and compliance controls can be extended to any custom-built AI application and agent via the new Purview SDK or the native Purview integration with Azure AI Foundry

These new capabilities make it easy for security teams to bring the same enterprise-grade data security and compliance controls available today for Microsoft 365 Copilot to custom AI applications and agents, so organizations can:

- Discover data security risks, such as sensitive data in user prompts, and data compliance risks, such as harmful content, and get recommended actions to mitigate risks proactively in Microsoft Purview Data Security Posture Management (DSPM) for AI.
- Protect sensitive data against data leakage and insider risks with Microsoft Purview data security policies.
- Govern AI interactions with Audit, Data Lifecycle Management, eDiscovery, and Communication Compliance.

Microsoft Purview SDK

Microsoft Purview now offers the Purview SDK, a set of REST APIs, documentation, and code samples, currently in preview, enabling developers to integrate Purview’s data security and compliance capabilities into AI applications or agents within any integrated development environment (IDE).

Figure: By embedding Purview APIs into the IDE, developers help enable their AI apps to be secured and governed at runtime

For example, a developer building an AI agent using an AWS model can use the Purview SDK to enable their AI app to automatically identify and block sensitive data entered by users before it’s exposed to the model, while also providing security teams with valuable signals that support compliance.

With the Purview SDK, startups, ISVs, and partners can now embed Purview’s industry-leading capabilities directly into their AI software solutions, making those solutions Purview-aware and making it easier for their customers to secure and govern data in their AI solutions. For example, Infosys Vice President and Delivery Head of Cyber Security Practice, Ashish Adhvaryu, notes: “Infosys Cyber Next platform integrates Microsoft Purview to provide enhanced AI security capabilities. Our solution, the Cyber Next AI assistant (Cyber Advisor) for the SOC analyst, leverages Purview SDK to drive proactive threat mitigation with real-time monitoring and auditing capabilities. This integration provides holistic AI-assisted protection, enhancing cybersecurity posture.”

Microsoft partner EY (formerly Ernst & Young) has also leveraged the new Purview SDK to embed Purview value into their generative AI initiatives. “We’re not just building AI tools, we are creating agentic solutions where trust, security, and transparency are present from the start, supported by the policy controls provided through the Purview SDK. We’re seeing 25 to 30 percent time savings when we build secure features using the Purview SDK,” noted Sumanta Kar, Partner, Innovation and Emerging Tech at EY.

Learn more about the Purview SDK.
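As a concrete illustration of the prompt-screening flow described above, here is a deliberately hypothetical sketch: the endpoint path, payload shape, and response fields are placeholders rather than the documented Purview SDK contract, so treat this as the shape of the integration and consult the SDK documentation for the real API.

```python
# Hypothetical sketch of gating a model call on a Purview screening result.
# The URL, payload, and response fields below are placeholders, NOT the
# documented Purview SDK REST contract.
import requests

PURVIEW_SCREEN_URL = "https://<purview-endpoint>/processContent"  # placeholder

def prompt_is_safe(prompt: str, access_token: str) -> bool:
    """Return True when Purview reports no sensitive-data findings."""
    resp = requests.post(
        PURVIEW_SCREEN_URL,
        headers={"Authorization": f"Bearer {access_token}"},
        json={"content": prompt},  # placeholder payload shape
        timeout=30,
    )
    resp.raise_for_status()
    return not resp.json().get("findings")  # placeholder response field

# Usage: only forward the prompt to the model when screening passes.
# if prompt_is_safe(user_prompt, token):
#     answer = call_model(user_prompt)
```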
Microsoft Purview integrates natively with Azure AI Foundry

Organizations are developing an average of 14 custom AI applications. The rapid pace of AI innovation may leave security teams unaware of potential data security and compliance risks within their environments. With the update announced today, Azure AI Foundry signals are directly integrated with Purview Data Security Posture Management for AI, Insider Risk Management, and data compliance controls, minimizing the need for additional development work.

For example, for AI applications and agents built with Azure AI Foundry models, data security teams can gain visibility into AI usage and data risks in Purview DSPM for AI, with no additional work from developers. Data security teams can also detect, investigate, and respond to both malicious and inadvertent user activities, such as a departing employee using an AI agent to retrieve an anomalous amount of sensitive data, with Microsoft Purview Insider Risk Management (IRM) policies. Lastly, user prompts and AI responses in Azure AI apps and agents can now be ingested into the Purview compliance tools mentioned above.

Learn more about Microsoft Purview for Azure AI Foundry.

2. Purview data protections extended to Copilot Studio agents grounded in Microsoft Dataverse data

Coming to preview in June, Purview Information Protection extends auto-labeling and label-inheritance coverage to Dataverse to help prevent oversharing and data leaks. Information Protection makes it easier for organizations to automatically classify and protect sensitive data at scale. A common challenge is that sensitive data often lands in Dataverse from various sources without consistent labeling or protection, and the rapid adoption of agents built with Copilot Studio and grounded in Dataverse data increases the risk of oversharing and leakage if that data is not properly protected.

With auto-labeling, data stored in Dataverse tables can be automatically labeled based on policies set in Microsoft Purview, regardless of its source. This reduces manual labeling effort and protects sensitive information from the moment it enters Dataverse. With label inheritance, AI agent responses grounded in Dataverse data automatically carry and honor the source data’s sensitivity label. If a response pulls from multiple tables with different labels, the most restrictive label is applied to ensure consistent protection.

For example, a financial advisor building an agent in Copilot Studio might connect multiple Dataverse tables, some labeled “General” and others “Highly Confidential.” If a response pulls from both, it inherits the most restrictive label, in this case “Highly Confidential,” to prevent unauthorized access and ensure appropriate protections apply to both the maker and the users of the agent.
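To make the inheritance rule concrete, here is a minimal sketch of the “most restrictive label wins” logic; the label names and their ranking are illustrative stand-ins for whatever ordering your Purview label taxonomy defines.

```python
# Sketch of the label-inheritance rule: a response grounded in multiple
# Dataverse tables inherits the most restrictive sensitivity label.
# The ranking below is illustrative, not an official Purview ordering.
LABEL_RANK = {"Public": 0, "General": 1, "Confidential": 2, "Highly Confidential": 3}

def inherited_label(source_labels: list[str]) -> str:
    """Return the most restrictive label among the grounding sources."""
    return max(source_labels, key=lambda label: LABEL_RANK[label])

# A response drawing on "General" and "Highly Confidential" tables
# inherits "Highly Confidential".
print(inherited_label(["General", "Highly Confidential"]))
```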
Together, auto-labeling and label inheritance in Dataverse support a more secure, automated foundation for AI.

Figure: Sensitivity labels will be automatically applied to data in Dataverse

Figure: AI-generated responses will inherit and honor the source data’s sensitivity labels

Learn more about protecting Dataverse data with Microsoft Purview.

3. Purview DSPM for AI can now provide visibility into unauthenticated interactions with Copilot Studio agents

As organizations increasingly use Microsoft Copilot Studio to deploy AI agents for frontline customer interactions, gaining visibility into unauthenticated user interactions and proactively mitigating risks becomes increasingly critical. Building on existing Purview and Copilot Studio integrations, we’ve extended DSPM for AI and Audit in Copilot Studio to provide visibility into unauthenticated interactions, now in preview. This gives organizations a more comprehensive view of AI-related data security risks across authenticated and unauthenticated users. For example, a healthcare provider hosting an external, customer-facing agent must be able to detect and respond to attempts by unauthenticated users to access sensitive patient data. With these new capabilities in DSPM for AI, data security teams can now identify these interactions, assess potential exposure of sensitive data, and act accordingly. Additionally, integration with Purview Audit gives teams seamless access to the information needed for audit requirements.

Figure: Gain visibility into all AI interactions, including those from unauthenticated users

Learn more about Purview for Copilot Studio.

4. Purview Data Loss Prevention extended to more Microsoft 365 agent scenarios

To help organizations prevent data oversharing through AI, at Ignite 2024 we announced that data security admins could prevent Microsoft 365 Copilot from using certain labeled documents as grounding data to generate summaries or responses. Now in preview, this control also extends to agents published in Microsoft 365 Copilot that are grounded in Microsoft 365 data, including pre-built Microsoft 365 agents, agents built with Agent Builder, and agents built with Copilot Studio. This helps ensure that files containing sensitive content are used appropriately by AI agents, for example, confidential legal documents with highly specific language that could lead to improper guidance if summarized by an AI agent, or “Internal only” documents that shouldn’t be used to generate content that can be shared outside the organization.

Figure: Extend data loss prevention (DLP) policies to Microsoft 365 Copilot agents to protect sensitive data

Learn more about Data Loss Prevention for Microsoft 365 Copilot and agents.

The data protection capabilities we are extending to agents in Agent Builder and Copilot Studio reflect our continued investment in the Security and Governance pillar of the Copilot Control System (CCS). CCS provides integrated controls to help IT and security teams secure, manage, and monitor Copilot and agents across Microsoft 365, spanning governance, management, and reporting. Learn more here.

Explore additional resources

As developers and security teams continue to secure AI throughout its lifecycle, it’s important to stay ahead of emerging risks and ensure protection. Microsoft Security provides a range of tools and resources to help you proactively secure AI models, apps, and agents from code to runtime.
Explore the following resources to deepen your understanding and strengthen your approach to AI security:

- Learn more about Security for AI solutions on our webpage
- Learn more about the Microsoft Purview SDK
- Get started with Azure AI Foundry
- Get started with Microsoft Entra
- Get started with Microsoft Purview
- Get started with Microsoft Defender for Cloud
- Get started with Microsoft 365 Copilot
- Get started with Copilot Studio
- Sign up for a free Microsoft 365 E5 Security Trial and Microsoft Purview Trial

1. Gartner, Predicts 2025: Navigating Imminent AI Turbulence for Cybersecurity, Jeremy D’Hoinne, Akif Khan, Manuel Acosta, Avivah Litan, Deepak Seth, Bart Willemsen, 10 February 2025.
2. IBM, “Cost of a Data Breach 2024: Financial Industry,” IBM Think, 13 August 2024, https://www.ibm.com/think/insights/cost-of-a-data-breach-2024-financial-industry; Tamas Cser, “The Cost of Finding Bugs Later in the SDLC,” Functionize, 5 January 2023, https://www.functionize.com/blog/the-cost-of-finding-bugs-later-in-the-sdlc.