Apr 27 2017 04:13 AM
Hi,
Disclaimer: I am new to PowerShell, which is why I'm turning here for your input.
Background:
I'm creating a Power BI dashboard based on data exported from the O365 Audit Log. For the moment, I'm not using the recently launched API, but a daily scheduled PowerShell script to export the data. Everything works fine with one exception:
Issue:
The maximum number of objects that can be returned in one query is 5000 (see the article for details). We have a fairly small tenant, so I have split the query into two runs covering 24 hours (AM + PM), which gives us 10k rows and is usually enough. The exception is when we run SharePoint backups via Metalogix: that operation walks through every single file, which results in logs well above 10k rows.
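Roughly, the two runs look like this (a sketch assuming the export uses Search-UnifiedAuditLog; the dates and output paths are placeholders):

$yesterday = (Get-Date).Date.AddDays(-1)

# First half of the day (AM)
Search-UnifiedAuditLog -StartDate $yesterday -EndDate $yesterday.AddHours(12) -ResultSize 5000 |
    Export-Csv -Path "C:\AuditExport\AuditLog_AM.csv" -NoTypeInformation

# Second half of the day (PM)
Search-UnifiedAuditLog -StartDate $yesterday.AddHours(12) -EndDate $yesterday.AddDays(1) -ResultSize 5000 |
    Export-Csv -Path "C:\AuditExport\AuditLog_PM.csv" -NoTypeInformation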
Solution?
What I want to achieve is to exclude the user account associated with the backup operation, so that only "relevant" objects/events are returned. I was hoping to achieve this by using the parameters provided in the cmdlet, but with my limited knowledge I can't figure it out. I can pass user IDs to include, but would this also allow me to exclude by user ID? If so, how?
I could always split the query into more than two runs per 24 hours, allowing more objects to be returned, but I hope there is a better solution.
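For illustration, a client-side filter that drops the backup account would look something like this (a sketch assuming Search-UnifiedAuditLog; the account name and paths are placeholders). Note that it filters after the query, so the backup events still count toward the 5000-object cap:

$start = (Get-Date).Date.AddDays(-1)
$backupAccount = "backup.service@contoso.com"   # placeholder for the Metalogix backup account

Search-UnifiedAuditLog -StartDate $start -EndDate $start.AddHours(12) -ResultSize 5000 |
    Where-Object { $_.UserIds -ne $backupAccount } |
    Export-Csv -Path "C:\AuditExport\AuditLog_AM_filtered.csv" -NoTypeInformation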
Many thanks!
Jun 04 2019 01:50 PM
@Pontus T Hello, I love your script; it's exactly what we have been trying to accomplish. However, I have found a problem. It seems that many records are duplicated, and even though I am sorting the records in the command by creation date, they seem to be out of order. It's almost as if the file is appended about 100 records at a time, and I notice the creation dates are jumbled. For example, the creation dates will be:
5/20/2019 0:19
5/19/2019 21:23
5/16/2019 22:40
5/16/2019 20:34
5/16/2019 20:30
5/16/2019 20:30
5/16/2019 20:30
5/16/2019 12:28
5/16/2019 12:26
5/24/2019 21:00
5/24/2019 20:59
5/24/2019 20:59
5/24/2019 20:58
5/24/2019 20:58
5/24/2019 20:57
This wouldn't be so bad, as I can sort the data in Excel, but because there are duplicate records it seems there may be some overlap during the append.
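A post-processing step along these lines could clean up the export (a sketch; the paths are placeholders, and the Identity/CreationDate columns assume the default Search-UnifiedAuditLog output piped to Export-Csv):

$path = "C:\AuditExport\AuditLog.csv"   # placeholder path to the exported file

Import-Csv -Path $path |
    Sort-Object -Property Identity -Unique |                 # Identity is unique per audit record, so this drops duplicates
    Sort-Object -Property { [datetime]$_.CreationDate } |    # then put everything in chronological order
    Export-Csv -Path "C:\AuditExport\AuditLog_clean.csv" -NoTypeInformation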
Jun 05 2019 07:33 AM
@Rajiv Chokshi Thanks Rajiv, that script works much like the one in this post, but it has some additional logging functionality that I like. However, there is still an issue with duplicate records, and the append function still seems to write the data out of order. When I look at creation times they are broken up; the numbers below show how the appending is batched, when the creation times should all be written sequentially. Any idea why this is not happening? Does it need a separate flag or pipe to fix this? (A possible workaround is sketched after the number list below.)
10
9
8
7
6
5
4
3
2
1
18
17
16
15
14
13
12
11
25
24
23
22
21
20
19
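One way to avoid the jumbled batches (a sketch, not the exact script from this thread; it assumes Search-UnifiedAuditLog with -SessionCommand ReturnLargeSet) is to collect every page first and only sort, de-duplicate, and write the CSV once at the end, instead of appending each batch as it comes back:

$sessionId = [guid]::NewGuid().ToString()   # one session per export window
$start     = (Get-Date).Date.AddDays(-1)
$end       = $start.AddDays(1)
$all       = @()

do {
    # ReturnLargeSet pages through the results 5000 at a time; pages are not guaranteed to come back in order
    $batch = Search-UnifiedAuditLog -StartDate $start -EndDate $end -SessionId $sessionId -SessionCommand ReturnLargeSet -ResultSize 5000
    if ($batch) { $all += $batch }
} while ($batch -and $batch.Count -eq 5000)

# Sort and de-duplicate once, then write the file in a single pass
$all |
    Sort-Object -Property Identity -Unique |
    Sort-Object -Property CreationDate |
    Export-Csv -Path "C:\AuditExport\AuditLog_sorted.csv" -NoTypeInformation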
Apr 30 2024 07:34 PM - edited Apr 30 2024 07:51 PM
Here is my approach to solving this problem. I had something similar and wanted to share it with you. There was a lot of chatter about one specific parameter supposedly rendering the 5000 limit irrelevant within the 24 hours, so I created a 4-hour iteration that ignores the bogus parameter. I hope it helps you.
You could add a group of operation restrictions by using the -in operator if you have a bunch of them.
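In outline, the 4-hour iteration could look something like this (a sketch assuming Search-UnifiedAuditLog; the operation names and output path are placeholders):

$dayStart = (Get-Date).Date.AddDays(-1)

# Walk one day in 4-hour slices so each query stays under the 5000-object limit
$results = for ($hour = 0; $hour -lt 24; $hour += 4) {
    $sliceStart = $dayStart.AddHours($hour)
    $sliceEnd   = $dayStart.AddHours($hour + 4)

    Search-UnifiedAuditLog -StartDate $sliceStart -EndDate $sliceEnd -ResultSize 5000 |
        Where-Object { $_.Operations -in @('FileAccessed', 'FileModified') }   # example restriction using -in
}

$results |
    Sort-Object -Property CreationDate |
    Export-Csv -Path "C:\AuditExport\AuditLog_4h.csv" -NoTypeInformation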