First published on TECHNET on Dec 14, 2017
Hi folks,
Here I am again. This time I am writing my second post on the HTTP Data Collector, in particular on how to make that data ingestion automatic.
There are several methods that can be used; which one fits best depends on your needs and environment.
Background:
I implemented a script to ingest data using the HTTP Data Collector. This data represents a custom Log Analytics field that maps to the Organizational Unit (OU) a given computer belongs to.
In the example below, you can see the part of the script that retrieves and pushes the OU information:
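The original snippet did not survive in this copy of the post. As a sketch, the OU lookup part could look like this (assuming a domain-joined computer; `[adsisearcher]` queries Active Directory with no extra modules, and the variable names here are illustrative, not the post's originals):

```powershell
# Hypothetical sketch: find the local computer's AD object and derive its OU path.
$searcher = [adsisearcher]"(&(objectCategory=computer)(name=$env:COMPUTERNAME))"
$dn = $searcher.FindOne().Properties['distinguishedname'][0]

# Strip the leading "CN=<computername>," so only the OU portion of the path remains.
$ou = $dn.Substring($dn.IndexOf(',') + 1)
```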
I later want to use this data to create computer groups in Azure Log Analytics based on the OU. Since Azure Log Analytics has a data retention policy, I need to ingest this data at a given frequency: say, once per day.
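Once the ingested records land as a custom log type (here called `OUInfo_CL` purely for illustration, with a hypothetical `OUName_s` custom field), a computer group could be defined with a query along these lines:

```
OUInfo_CL
| where TimeGenerated > ago(1d)
| where OUName_s == "Servers"
| distinct Computer
```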
As mentioned above, I could use Task Scheduler in Windows to create a task that executes my PowerShell script; but since System Center Operations Manager (SCOM) is already monitoring all my computers, I would rather go with SCOM.
Implementation:
I will focus the discussion on SCOM: the other methods have been discussed in several other blogs, and Task Scheduler is something we all know very well. Moreover, I personally do not recommend any approach that involves manual configuration or actions to be carried out on more than one target.
At this point the question is: How can I teach SCOM to retrieve and send data?
First, note that this approach also works for agents that have not been configured to report to Azure Log Analytics, as long as they have Internet access. Second, the "secret" ingredient you need is just a "collection rule" that runs a PowerShell script (mine, or any other script of your choice), and that's it.
I will continue my discussion with the assumption that you have dealt with Management Pack authoring before. If not, you will appreciate the following links:
I built my data ingestion MP using a fragment from Kevin Holman's SCOM Management Pack VSAE Fragment Library. I used the Timed PowerShell Rule (identified by the Rule.TimedScript.PowerShell.WithParams.mpx file). Since this was just a Proof of Concept (PoC), I created the MP directly in an XML editor, doing a bit of copy/paste.
Before importing the sample MP, let's have a look at the key parts:
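The key parts did not make it into this copy of the post; as a rough, simplified sketch, a timed rule of this shape pairs a `System.Scheduler` data source with a PowerShell write action (the rule ID, target, interval, script name, and parameter values below are placeholders, and Kevin's fragment generates the full, correct wiring for you):

```xml
<Rule ID="Demo.HTTPUpload.OU.Rule" Enabled="false"
      Target="Windows!Microsoft.Windows.Server.OperatingSystem">
  <Category>Maintenance</Category>
  <DataSources>
    <!-- Fires on a schedule; note "SimpleReccuringSchedule" is the actual schema spelling -->
    <DataSource ID="Scheduler" TypeID="System!System.Scheduler">
      <Scheduler>
        <SimpleReccuringSchedule>
          <Interval Unit="Hours">24</Interval>
        </SimpleReccuringSchedule>
        <ExcludeDates />
      </Scheduler>
    </DataSource>
  </DataSources>
  <WriteActions>
    <!-- Runs the upload script each time the scheduler fires -->
    <WriteAction ID="WA" TypeID="Windows!Microsoft.Windows.PowerShellWriteAction">
      <ScriptName>UploadOUData.ps1</ScriptName>
      <ScriptBody><!-- script body goes here --></ScriptBody>
      <Parameters>
        <Parameter>
          <Name>WorkspaceID</Name>
          <Value>YOUR-WORKSPACE-ID</Value>
        </Parameter>
      </Parameters>
      <TimeoutSeconds>300</TimeoutSeconds>
    </WriteAction>
  </WriteActions>
</Rule>
```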
I then passed them through this way (you will need to replace the placeholders with your own WorkspaceID and PrimaryKey):
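The code itself is missing from this copy of the post. The upload follows the documented HTTP Data Collector API signing scheme, so a self-contained sketch could look like this (the workspace ID, key, log type, and payload below are placeholders; `OUInfo` would surface in Log Analytics as `OUInfo_CL`):

```powershell
# Placeholders - replace with your own values.
$WorkspaceID = "YOUR-WORKSPACE-ID"
$PrimaryKey  = "YOUR-PRIMARY-KEY"
$LogType     = "OUInfo"   # record type; appears as OUInfo_CL in Log Analytics
$json        = '[{"Computer":"SERVER01","OUName":"Servers"}]'

$method      = "POST"
$contentType = "application/json"
$resource    = "/api/logs"
$rfc1123date = [DateTime]::UtcNow.ToString("r")
$body        = [Text.Encoding]::UTF8.GetBytes($json)

# Build the HMAC-SHA256 signature over the canonical request string.
$stringToHash = "$method`n$($body.Length)`n$contentType`nx-ms-date:$rfc1123date`n$resource"
$hmac         = New-Object System.Security.Cryptography.HMACSHA256
$hmac.Key     = [Convert]::FromBase64String($PrimaryKey)
$hash         = $hmac.ComputeHash([Text.Encoding]::UTF8.GetBytes($stringToHash))
$signature    = "SharedKey ${WorkspaceID}:$([Convert]::ToBase64String($hash))"

# POST the payload to the workspace's data collector endpoint.
$uri     = "https://$WorkspaceID.ods.opinsights.azure.com$resource" + "?api-version=2016-04-01"
$headers = @{ "Authorization" = $signature; "Log-Type" = $LogType; "x-ms-date" = $rfc1123date }
Invoke-WebRequest -Uri $uri -Method $method -ContentType $contentType `
                  -Headers $headers -Body $body -UseBasicParsing
```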
With all that said, it should now be easier to create rules that upload data to Azure Log Analytics. Attached to this post, I am providing my sample MP, which includes both the RegistryKey-based and the OU-based rules. As I normally do, the rules come disabled by default.
Lesson Learned:
Thanks,
Bruno.
HTTPUploadDataToLogAnalytics.Addendum.zip