Latest Discussions
Getting Teams meeting transcripts using Powershell with Graph API
I have set up an Enterprise App in Entra with the following API permissions:

Microsoft.Graph OnlineMeetings.Read (Delegated)
OnlineMeetings.Read.All (Application)
User.Read.All (Application)

Admin consent has been granted for the Application types. Below is the code snippet for getting the meetings:

$tenantId = "xxxxxx"
$clientId = "xxxxxx"
$clientSecret = "xxxxxx"
$secureSecret = ConvertTo-SecureString $clientSecret -AsPlainText -Force
$psCredential = New-Object System.Management.Automation.PSCredential ($clientId, $secureSecret)
Connect-MgGraph -TenantId $tenantId -ClientSecretCredential $psCredential -NoWelcome
$meetings = Get-MgUserOnlineMeeting -UserId "email address removed for privacy reasons" -All

Connect-MgGraph is invoked without errors; I verified this with the Get-MgContext command. At line 10 (the Get-MgUserOnlineMeeting call) I get this error:

Status: 404 (NotFound) ErrorCode: UnknownError

I don't know if this means there was an error in the API call, or there were no records found (I do have Teams calls with transcripts, though). I have tried changing the last line to (without -All):

$meetings = Get-MgUserOnlineMeeting -UserId "my guid user id here"

And I get this error:

Status: 403 (Forbidden) ErrorCode: Forbidden

Adding the -All parameter results in this error:

Filter expression expected - /onlineMeetings?$filter={ParameterName} eq '{id}'.

I've done some searching but I haven't found any more information or a solution for this. I hope someone can point me in the right direction. Thanks in advance!

paulnerie · Apr 23, 2025 · Copper Contributor · 22 Views · 0 likes · 0 Comments
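For illustration only: the last error message suggests that, with app-only authentication, Graph expects a filtered query rather than an unfiltered listing of a user's meetings. A minimal sketch of such a call, assuming the meeting's join URL is already known and that an application access policy has been granted to the app; the URL and user ID below are placeholders, not values from the original post:

# Sketch, not a verified fix: query a single online meeting by its join URL.
# An application access policy covering this user is assumed to be in place.
$joinUrl = "https://teams.microsoft.com/l/meetup-join/..."   # placeholder

$meeting = Get-MgUserOnlineMeeting -UserId "user-guid-here" -Filter "JoinWebUrl eq '$joinUrl'"
$meeting | Format-List Id, Subject, StartDateTime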
Gui to deploy folder contents to multiple VMs

I am trying to improve imaging computers where I work. I need to create a GUI for new hires, since the imaging process is so complicated. The GUI needs to request the names of the computers being imaged and then copy files from a local workstation to those machines, which our technicians do not have physical access to on the network. I have turned to PowerShell for the solution in an attempt to improve my knowledge, which is quite basic really. Below is the code I have come up with so far. In this code I am getting the location of the file. I would rather copy the entire folder instead of the file, but I couldn't find the code to do that - so if that is possible, please show me how. If not, I figure I would have to save these imaging files to a ZIP file; then I could maybe use this GUI I am working on to move the ZIP file to the remote computers.

Add-Type -AssemblyName System.Windows.Forms

# Create the form
$form = New-Object System.Windows.Forms.Form
$form.Text = "File and Network Location Collector"
$form.Size = New-Object System.Drawing.Size(400, 200)

# Create the label for file name
$fileLabel = New-Object System.Windows.Forms.Label
$fileLabel.Text = "File Name:"
$fileLabel.Location = New-Object System.Drawing.Point(10, 20)
$form.Controls.Add($fileLabel)

# Create the text box for file name
$fileTextBox = New-Object System.Windows.Forms.TextBox
$fileTextBox.Location = New-Object System.Drawing.Point(100, 20)
$fileTextBox.Size = New-Object System.Drawing.Size(250, 20)
$form.Controls.Add($fileTextBox)

# Create the label for network location
$networkLabel = New-Object System.Windows.Forms.Label
$networkLabel.Text = "Network Location:"
$networkLabel.Location = New-Object System.Drawing.Point(10, 60)
$form.Controls.Add($networkLabel)

# Create the text box for network location
$networkTextBox = New-Object System.Windows.Forms.TextBox
$networkTextBox.Location = New-Object System.Drawing.Point(100, 60)
$networkTextBox.Size = New-Object System.Drawing.Size(250, 20)
$form.Controls.Add($networkTextBox)

# Create the button to submit
$submitButton = New-Object System.Windows.Forms.Button
$submitButton.Text = "Submit"
$submitButton.Location = New-Object System.Drawing.Point(150, 100)
$form.Controls.Add($submitButton)

# Add event handler for the button click
$submitButton.Add_Click({
    $fileName = $fileTextBox.Text
    $networkLocation = $networkTextBox.Text
    [System.Windows.Forms.MessageBox]::Show("File Name: $fileName`nNetwork Location: $networkLocation")
})

# Show the form
$form.ShowDialog()

In the next portion of the code, below, it is copying from one source to many locations. Thank you for any assistance, as this would help my organization a lot. We are getting several new hires who are very new to the industry. This would be a huge blessing. Pardon the change in font size - it did that for no reason, it's my first time using the blog, and there appears to be no way to change the sizes lol. Forgive me.
# Define the source folder and the list of target computers
$sourceFolder = "C:\Path\To\SourceFolder"
$destinationFolder = "C:\Path\To\DestinationFolder"
$computers = @("Computer1", "Computer2", "Computer3") # Replace with actual computer names

# Function to copy the folder
function Copy-Folder {
    param (
        [string]$source,
        [string]$destination
    )
    Copy-Item -Path $source -Destination $destination -Recurse -Force
}

# Execute the copy operation on each computer
foreach ($computer in $computers) {
    Invoke-Command -ComputerName $computer -ScriptBlock {
        param ($source, $destination)
        Copy-Folder -source $source -destination $destination
    } -ArgumentList $sourceFolder, $destinationFolder
}

Write-Host "Folder copied to all specified computers."

techhondo · Apr 21, 2025 · Copper Contributor · 16 Views · 0 likes · 0 Comments
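As an aside, here is a minimal sketch of copying an entire folder from the local workstation out to each remote computer, assuming PowerShell Remoting (WinRM) is enabled on the targets; the paths and computer names are placeholders. Copy-Item's -ToSession parameter copies from the local machine into the remote session, so no helper function needs to exist on the remote side:

# Sketch only: push a whole folder tree from the local workstation to each target.
$sourceFolder      = "C:\Path\To\SourceFolder"        # folder on the local workstation
$destinationFolder = "C:\Path\To\DestinationFolder"   # path on each remote machine
$computers         = @("Computer1", "Computer2", "Computer3")

foreach ($computer in $computers) {
    $session = New-PSSession -ComputerName $computer
    # -ToSession copies from the local machine into the remote session;
    # -Recurse brings the whole folder tree across.
    Copy-Item -Path $sourceFolder -Destination $destinationFolder -ToSession $session -Recurse -Force
    Remove-PSSession $session
}

Write-Host "Folder copied to all specified computers."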
Activating a user's multiple PIM groups using PowerShell

Hi All,

Following on from the implementation of PIM by one of my clients: due to the large number of groups for some staff (developers, etc.), we have looked into activating them programmatically. However, this always appears to fall over on the syntax, whether using Get-MgPrivilegedAccessGroupEligibilityScheduleInstance, or Invoke-MgGraphRequest -Method POST -Uri "https://graph.microsoft.com/beta/identityGovernance/privilegedAccess/group/assignments", or New-MgRoleManagementDirectoryRoleAssignmentScheduleRequest. In various scripts it either falls over intermittently, saying '...is not recognised as the name of a cmdlet...', and so on. I wanted to check whether anyone else has achieved this. I am trying to avoid reworking what they have put in place over the past three months or so.

Many thanks,
MoZZa

Solved · _MoZZa · Apr 17, 2025 · Brass Contributor · 25 Views · 0 likes · 1 Comment
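For what it's worth, a hedged sketch of self-activating one eligible group membership through the Graph PowerShell SDK. The cmdlet name, permission scope, IDs, duration and justification below are assumptions based on the v1.0 privilegedAccess group API, not details from the poster's environment:

# Sketch only: assumes the Microsoft.Graph Identity.Governance module is installed
# and the signed-in user is eligible for the target group.
Connect-MgGraph -Scopes "PrivilegedAssignmentSchedule.ReadWrite.AzureADGroup"

$params = @{
    accessId      = "member"
    principalId   = "<my-user-object-id>"      # placeholder
    groupId       = "<eligible-group-id>"      # placeholder
    action        = "selfActivate"
    scheduleInfo  = @{
        startDateTime = (Get-Date).ToUniversalTime().ToString("o")
        expiration    = @{ type = "afterDuration"; duration = "PT8H" }
    }
    justification = "Daily activation for development work"
}

New-MgIdentityGovernancePrivilegedAccessGroupAssignmentScheduleRequest -BodyParameter $params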
Get-MgDeviceAppManagementManagedAppPolicy -ManagedAppPolicyID. How to get the ID?

Hello! I am trying to copy an Intune App Protection Policy so I can edit it and apply it to a different group of users. I've cobbled together the script below from other examples, but it doesn't work because I am not able to find the -ManagedAppPolicyID that it wants. I've not been able to find it anywhere in Intune, and I've not been able to find a PowerShell cmdlet that will list it either. Does anyone know how I can make this work? Or another way to do it?

Install-Module Microsoft.Graph -Scope CurrentUser
Connect-MgGraph -Scopes "DeviceManagementApps.ReadWrite.All"

$policyId = "<Insert App Policy ID>"
$appProtectionPolicy = Get-MgDeviceAppManagementManagedAppPolicy -ManagedAppPolicyId $policyId

$newPolicy = $appProtectionPolicy | Select-Object * -ExcludeProperty Id, CreatedDateTime, Version, LastModifiedDateTime
$newPolicy.DisplayName = "Copy of $($newPolicy.DisplayName)"

New-MgDeviceAppManagementMobileAppConfiguration -Data $newPolicy
Get-MgDeviceAppManagementManagedAppPolicy -Filter "displayName eq '$($newPolicy.DisplayName)'"

kcelmer · Apr 15, 2025 · Copper Contributor · 36 Views · 0 likes · 2 Comments
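A minimal sketch of one way to discover the policy IDs: calling the cmdlet without -ManagedAppPolicyId should return the collection of App Protection (managed app) policies, and the Id property on each is the value the parameter expects. The read-only scope used here is an assumption:

# Sketch only: list all managed app policies with their IDs.
Connect-MgGraph -Scopes "DeviceManagementApps.Read.All"

Get-MgDeviceAppManagementManagedAppPolicy |
    Select-Object Id, DisplayName |
    Sort-Object DisplayName |
    Format-Table -AutoSize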
.Net mail message, PowerShell and Microsoft Purview Information Protection

I have a PowerShell script that uses the .NET MailMessage class to send emails. We want to restrict some of those emails to a certain sensitivity label (we call it classification) and restrict them to internal users only (which this label does when sending via Outlook). I have looked at a number of ways to do this but haven't come up with anything that works. Here are the issues: the SMTP server is NOT in Office 365, and the PowerShell window is opened as an admin account, so using an Outlook interface might not work. Currently I have it set to send remotely (a session is created with the server, which is whitelisted, and it actually sends the message). Any information would be of great assistance.

DFOTA · Apr 10, 2025 · Copper Contributor · 14 Views · 0 likes · 0 Comments
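For context only, a minimal sketch of the kind of .NET mail send the post describes, with no sensitivity label applied; the server, port and addresses are placeholders:

# Sketch: plain System.Net.Mail send, as described in the post.
$mail = New-Object System.Net.Mail.MailMessage
$mail.From = "noreply@contoso.example"
$mail.To.Add("colleague@contoso.example")
$mail.Subject = "Internal report"
$mail.Body = "Report attached."

$smtp = New-Object System.Net.Mail.SmtpClient("smtp.internal.example", 25)
$smtp.Send($mail)
$mail.Dispose()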
Purview -> Powershell

I need to export some users' data before their licenses are removed. It is about 60 users, so I would rather use PowerShell instead of the Purview portal to automate the job. I have been playing around with the cmdlets to get an idea of how to build the script. The strange thing is that what I see in PowerShell is not represented in the Purview portal.

We had an older compliance case which was no longer used. I tried to remove the compliance case via the Purview portal, but nothing happens when clicking "delete case" or "close case". I then reverted to PowerShell, using Remove-ComplianceCase "$CaseName", where the compliance case was successfully removed. When running Get-ComplianceCase, I can see that the old compliance case is indeed removed; however, the removed compliance case is still present in the Purview portal even several hours after deleting it with PowerShell.

I then started to play around with a new compliance search:

New-ComplianceSearch -Name "$($TargetMailbox.displayName) License Cleanup" -ExchangeLocation "$($TargetMailbox.PrimarySmtpAddress)" -Case "License Cleanup" -SharePointLocation "$($PNPPersonalSite.url)"

After refreshing a couple of times I could see the compliance search in the Purview portal. I then started the compliance search using the Start-ComplianceSearch cmdlet and verified that the search status was completed:

Get-ComplianceSearch "$($TargetMailbox.displayName) License Cleanup" | Select Status

However, in the Purview portal no statistics were shown (not available yet). I didn't pay too much attention to this, as I had already seen discrepancies between the Purview portal and what I saw in PowerShell, so I continued by exporting the compliance search with a compliance search action:

New-ComplianceSearchAction -SearchName "$($TargetMailbox.displayName)" -Export

I can successfully retrieve the compliance search action in PowerShell and can see that its status is completed, but I fail to retrieve the export in the Purview portal:

Get-ComplianceSearchAction -Case "License CleanUp" -IncludeCredential | fl

I did not find a way to download the export results via PowerShell, but I would already be pretty pleased if I could achieve the first two steps via PowerShell. As I am unable to retrieve the export in the Purview portal, however, I am afraid that I am still stuck. I can create an export in the Purview portal from the compliance search I created in PowerShell.

Can anyone please explain the discrepancies between what I see in PowerShell and the Purview portal? Is it possible to see the exports created in PowerShell in the Purview portal? And is it feasible to download the export from PowerShell as well (Start-Process)?

TherealKillerbe · Apr 03, 2025 · Brass Contributor · 51 Views · 0 likes · 0 Comments
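A hedged sketch of the first two steps in one pass (create and start the search, then queue the export action), assuming the ExchangeOnlineManagement module and the existing "License Cleanup" case; $TargetMailbox and $PNPPersonalSite are the poster's own variables and are assumed to be populated:

# Sketch only: create, run and export one compliance search.
Connect-IPPSSession

$searchName = "$($TargetMailbox.DisplayName) License Cleanup"

New-ComplianceSearch -Name $searchName -Case "License Cleanup" `
    -ExchangeLocation $TargetMailbox.PrimarySmtpAddress `
    -SharePointLocation $PNPPersonalSite.Url

Start-ComplianceSearch -Identity $searchName

# Wait for the search to finish before queueing the export action.
do {
    Start-Sleep -Seconds 30
    $status = (Get-ComplianceSearch -Identity $searchName).Status
} while ($status -ne 'Completed')

New-ComplianceSearchAction -SearchName $searchName -Export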
Remove computers from multiple domains from one AD group

Hello! I've tried a couple of scripts I found and still cannot remove these computers from this one AD group. The script I'm currently using is:

# Import the Active Directory module
Import-Module ActiveDirectory

# List of device names to be removed
$computers = @(
    "machine1.domain1",
    "machine2.domain2"
)

# Loop through each device name in the list
foreach ($computer in $computers) {
    # Get the device object from Active Directory
    $computer = Get-ADComputer -Identity $computer -ErrorAction SilentlyContinue

    # Check if the device exists
    if ($computer) {
        # Remove the device from Active Directory
        Get-ADComputer $computer | Remove-ADObject -Recursive -Confirm:$false
        Write-Host "Removed device $computer from Active Directory."
    } else {
        Write-Host "Device $computer not found in Active Directory."
    }
}

The errors I get say the object cannot be found, and it always lists Domain1. I'm pretty new to PS, so I would appreciate any guidance!

jmaraviglia · Apr 02, 2025 · Copper Contributor · 82 Views · 0 likes · 7 Comments
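As a sketch only: if the goal is to remove the computers from the one group rather than delete them from AD, one approach that may work is to resolve each computer against a domain controller in its own domain and then run the group removal against the group's domain. The group name, domain names and computer names below are placeholders:

# Sketch: remove cross-domain computer accounts from a single group.
Import-Module ActiveDirectory

$groupName   = "MyADGroup"            # placeholder: the group to clean up
$groupDomain = "domain1.example"      # placeholder: domain that hosts the group
$targets = @(
    @{ Name = "machine1"; Domain = "domain1.example" },
    @{ Name = "machine2"; Domain = "domain2.example" }
)

foreach ($t in $targets) {
    # Resolve the computer against a DC in its own domain.
    $computer = Get-ADComputer -Filter "Name -eq '$($t.Name)'" -Server $t.Domain
    if ($computer) {
        # Remove it from the group, talking to a DC in the group's domain.
        Remove-ADGroupMember -Identity $groupName -Members $computer -Server $groupDomain -Confirm:$false
        Write-Host "Removed $($t.Name) from $groupName."
    } else {
        Write-Host "$($t.Name) not found in $($t.Domain)."
    }
}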
MGraph suddenly stops working

PS C:\Windows> Get-MgUser -All
Get-MgUser : InteractiveBrowserCredential authentication failed:
At line:1 char:1
+ Get-MgUser -All
+ ~~~~~~~~~~~~~~~
    + CategoryInfo          : NotSpecified: (:) [Get-MgUser_List], AuthenticationFailedException
    + FullyQualifiedErrorId : Microsoft.Graph.PowerShell.Cmdlets.GetMgUser_List

Prior to this I ran Connect-MgGraph -Scopes "User.Read.All" and authenticated myself with MFA; I did not get an error doing so. I am logged in as a global administrator. Any ideas what is going wrong? I know the error indicates an authentication failure, but the authentication looks correct.

heinzelrumpel · Mar 28, 2025 · Brass Contributor · 54 Views · 0 likes · 2 Comments
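A minimal troubleshooting sketch, not a confirmed fix: clear the cached context, reconnect, and confirm the granted scopes before retrying:

# Clear any cached sign-in, reconnect with the needed scope, then verify.
Disconnect-MgGraph -ErrorAction SilentlyContinue
Connect-MgGraph -Scopes "User.Read.All"

# Scopes should include User.Read.All if consent succeeded.
(Get-MgContext).Scopes

# Retry with a small result set first.
Get-MgUser -Top 5 | Select-Object DisplayName, UserPrincipalName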
Adding AD users to a specific security group

Hi Everyone,

Sorry if this question has already been asked; I couldn't find an answer. I'm trying to write a PowerShell script that runs as a scheduled task to add AD users to a specific AD security group. The goal is for this to run daily. The script will first check the users' OU to determine if they are already members of the security group. If they are, it will skip them; if they are not members, it will add them to the group. I have created the following script, but I'm unsure if it's the best approach. Additionally, can this script be executed on a server that doesn't have Active Directory installed? If AD must be installed, would it be ideal to run it on a Domain Controller?

# Check if Active Directory module is already imported, import only if necessary
if (-not (Get-Module -Name ActiveDirectory)) {
    Import-Module ActiveDirectory
}

# Define the base OU and security group
$BaseOU = "OU=W11_USERS,DC=W11,DC=NET"
$SecurityGroup = "HR"

# Get all users from W11_USERS and its sub-OUs
$Users = Get-ADUser -SearchBase $BaseOU -SearchScope Subtree -Filter *

# Loop through each user and check group membership before adding
foreach ($User in $Users) {
    $UserDN = $User.DistinguishedName

    # Check if user is already a member of HR
    $IsMember = Get-ADGroupMember -Identity $SecurityGroup | Where-Object { $_.DistinguishedName -eq $UserDN }

    if (-not $IsMember) {
        Try {
            Add-ADGroupMember -Identity $SecurityGroup -Members $User -ErrorAction Stop
            Write-Host "Added $($User.SamAccountName) to $SecurityGroup" -ForegroundColor Green
        } Catch {
            Write-Host "Failed to add $($User.SamAccountName): $_" -ForegroundColor Red
        }
    } else {
        Write-Host "$($User.SamAccountName) is already a member of $SecurityGroup" -ForegroundColor Yellow
    }
}

Write-Host "User addition process completed."

UC_451435 · Mar 26, 2025 · Copper Contributor · 92 Views · 0 likes · 1 Comment
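As an illustrative variation on the same idea, this sketch reads the group membership once up front and adds any missing users in a single call. It reuses the OU and group name from the post and assumes the RSAT ActiveDirectory module, which can be installed on any member server (a Domain Controller is not required):

# Sketch only: membership is read once, not once per user.
Import-Module ActiveDirectory

$BaseOU = "OU=W11_USERS,DC=W11,DC=NET"
$SecurityGroup = "HR"

# Read the current membership once and index it by distinguished name.
$existing = @{}
Get-ADGroupMember -Identity $SecurityGroup | ForEach-Object {
    $existing[$_.DistinguishedName] = $true
}

# Find users in the OU (and sub-OUs) that are not yet members.
$toAdd = Get-ADUser -SearchBase $BaseOU -SearchScope Subtree -Filter * |
    Where-Object { -not $existing.ContainsKey($_.DistinguishedName) }

if ($toAdd) {
    # A single call adds all missing users at once.
    Add-ADGroupMember -Identity $SecurityGroup -Members $toAdd
    Write-Host "Added $(@($toAdd).Count) user(s) to $SecurityGroup." -ForegroundColor Green
} else {
    Write-Host "No new users to add to $SecurityGroup." -ForegroundColor Yellow
}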
Beginners performance tip: Use pipelines.

Hi folks,

In my role, I see a lot of poorly-written PowerShell code spanning a lot of different contexts. Without fail, the most common failing I see is that the script simply won't scale, meaning performance will decrease significantly when it's run against larger data sets. And within this context, one of the biggest reasons is the overuse of variables and the underutilisation of the PowerShell pipeline.

If you're the investigative type, here's some official documentation and training on the pipeline:

about_Pipelines - PowerShell | Microsoft Learn
Understand the Windows PowerShell pipeline - Training | Microsoft Learn

A short explanation is that piping occurs when the output from one command is automatically sent to and used by another command. As an example, let's say I want my first command to fetch all the files in my temporary directory (just the root in this case). I might run a command like the following:

Get-ChildItem -Path $env:TEMP -File

Which, as you'd expect, produces a list of files. Where PowerShell differs from the old command prompt (or DOS prompt, if you're old like me) is that it's not simply a bunch of text written to the screen. Instead, each of these files is an object. If you don't know what an object is, think of it as a school lunchbox for the time being, where that lunchbox contains a bunch of useful stuff (just data; nothing tasty, sadly) inside.

Because this isn't just a bunch of useless text, we can take the individual lunchboxes (objects) produced from this first command and send those to another command. As the second command sees each lunchbox, it can choose to do something with it - or even just ignore it; the possibilities are endless! When the lunchboxes coming out of the first command travel to the second command, the pathway they travel along is the pipeline! It's what joins the two commands together.

Continuing the example above, I now want to remove those lunchboxes - I mean files - from my temporary directory, which I'll do by piping the lunchboxes from the first command into a second command that will perform the deleting.

Get-ChildItem -Path $env:TEMP -File | Remove-Item -Confirm:$false -ErrorAction:SilentlyContinue

Now, there's no output to show for this command, but it does pretty much what you'd expect: it deletes the files.

There's another way we could have achieved the same thing using variables and loops, which I'll demonstrate first before circling back to how this relates to performance and scalability.

# Get all the files first and assign them to a variable.
$MyTempFiles = Get-ChildItem -Path $env:TEMP -File;

# Now that we have a single variable holding all files, send all those files (objects) to the second command to delete them.
$MyTempFiles | Remove-Item -Confirm:$false -ErrorAction:SilentlyContinue;

This isn't the only way you can go about it, and you can see I'm still using piping in the second command - but none of this is important. The important point - and this brings us back to the topic of performance - is that I've assigned all of the files to a variable instead of simply passing them over the pipeline. This means that all of these files consume memory and continue to do so until I get rid of the variable (named $MyTempFiles). Now, imagine that instead of dealing with a few hundred files in my temp directory, I'm dealing with 400,000 user objects from an Azure Active Directory tenant and I'm retrieving all attributes. The difference in memory usage is incomparable.

And when Windows starts feeling memory pressure, this impacts disk caching performance, and before you know it, the Event Log system starts throwing performance events everywhere. It's not a good outcome. So, the more objects you have, the more your performance decreases in a linear manner.

Pipeline to the rescue!

This conversation is deliberately simple and doesn't go into the internal mechanics of how many of the commands you might like using actually work, but the only relevant part I want to focus on is something called paging.

Let's say you use Get-MgBetaUser to pull down those 400,000 users from Azure Active Directory.

Get-MgBetaUser (Microsoft.Graph.Beta.Users) | Microsoft Learn

Internally, the command won't be pulling them down all at once. Instead, it will pull down a bite-sized chunk (i.e. a page, as evidenced by the ability to specify a value for the PageSize parameter that features in the above documentation) and push that out onto the pipeline. And if you are piping from Get-MgBetaUser to a second command, then that second command can read that set of users from the pipeline to do what it needs to do. And so on through any other commands until eventually there are no more commands left. At this stage - and this is where the memory efficiency comes in - that batch of users can be released from memory as they are no longer needed by anything. In pictures, and using a page size of 1,000, this looks like each page of 1,000 users flowing through the commands on the pipeline and becoming eligible for release once the last command has finished with it.

Now, as anyone familiar with .NET can attest to, memory isn't actually released immediately by default. The .NET engine manages memory resourcing and monitoring internally, but the key takeaway is that by using the pipeline, we're allowing the early release of memory to occur. Conversely, when we store everything in a variable, we're preventing the .NET memory manager from releasing memory early. This, in turn, leads to the above-mentioned performance issues. In pictures, this looks like every page remaining referenced by the variable, so nothing can be released until the variable itself goes away.

Is there real benefit? Yes, absolutely. And it can be quite significant, too. In one case, I triaged a script that was causing system failure (Windows PowerShell has/had a default process limit of 2 GB) through storing Azure results in a variable. After some minor tweaks so that it used the pipeline, process memory fluctuated between 250 MB and 400 MB.

Working with pipelines typically doesn't require any extra effort - and in some cases can actually condense your code. However, the performance and scalability benefits can potentially be quite significant - particularly on already-busy systems.

Cheers,
Lain

LainRobertson · Mar 20, 2025 · Silver Contributor · 94 Views · 2 likes · 0 Comments
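As a small sketch of the contrast described above, assuming the Microsoft.Graph.Beta.Users module: the first pattern keeps every user referenced in a variable, while the second streams each page straight through the pipeline to the CSV file, so processed objects can be released early.

# Variable-heavy pattern: every user object stays referenced until $users goes away.
$users = Get-MgBetaUser -All -PageSize 1000
$users | Export-Csv -Path .\users.csv -NoTypeInformation

# Pipeline pattern: each page flows through and can be released once written out.
Get-MgBetaUser -All -PageSize 1000 |
    Select-Object DisplayName, UserPrincipalName, Id |
    Export-Csv -Path .\users.csv -NoTypeInformation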
Tags
- Windows PowerShell (1,160 Topics)
- powershell (336 Topics)
- office 365 (274 Topics)
- azure active directory (140 Topics)
- sharepoint (128 Topics)
- Windows Server (127 Topics)
- azure (96 Topics)
- exchange (92 Topics)
- community (54 Topics)
- Azure Automation (48 Topics)