Azure Monitor: Gain Observability On Your DHCP Server
Published Jul 24 2023 09:00 AM
Microsoft

[2023-July-31]: The previous limitation has been resolved. I updated the PowerShell script so that the table name in the workbook file is replaced with the value passed as a parameter. Make sure you use the latest updated attachment.

 

[2023-July-27]: To avoid workbook issues, make sure you name the custom table DHCPLOG_CL (using the correct case). If you prefer another name, you have to edit the workbook code to point to the correct table.

 

Hello readers,

It is common for customers to need to expand observability over the entire IT infrastructure (see Azure Monitor: Expanding the Out-of-the-Box Observability for your IT Infrastructure). One of the requests I got in this area was to gain observability over DHCP servers. More specifically, a customer of mine wanted a sort of dashboard showing DHCP events, with the ability to do an easy search.

After 5 minutes of brainstorming I had the solution in mind: ingest DHCP logs into Azure Monitor, store them in a Log Analytics workspace, and visualize the data through Azure Workbooks. Looks complicated? It is not, but let us go step by step:

 

#1: Ingesting logs into Azure Monitor:

This is not difficult: you can follow the Collect text logs with Azure Monitor Agent documentation, or you can read ahead to see how I tried to make life easier by using templates and scripts to manage everything in one go. I created the script simply to avoid too many steps being completed manually. I preferred to have one single script to run, which does the following:

  1. Import the table definition schema
  2. Create the table in Log Analytics
  3. If the table creation was successful, create the necessary DCE and DCR using a JSON template
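For those curious about what the table-creation step looks like under the hood, it can be sketched with `Invoke-AzRestMethod` (from the Az.Accounts module) against the Log Analytics Tables REST API. This is an illustrative sketch, not the attached script: the subscription, resource group, workspace names, and the trimmed-down column list below are placeholders; the attached schema template contains the full field list.

```powershell
# Sketch: create the DHCPLOG_CL custom table via the Log Analytics Tables API.
# All resource names below are placeholders - substitute your own values.
$subId = "<subscription-id>"
$rg    = "<resource-group>"
$ws    = "<workspace-name>"

# Minimal illustrative schema; the attached template defines the full set of columns.
$tableBody = @{
    properties = @{
        schema = @{
            name    = "DHCPLOG_CL"
            columns = @(
                @{ name = "TimeGenerated"; type = "datetime" },
                @{ name = "EventId";       type = "string"   },
                @{ name = "Description";   type = "string"   },
                @{ name = "IPAddress";     type = "string"   }
            )
        }
    }
} | ConvertTo-Json -Depth 10

Invoke-AzRestMethod -Method PUT `
    -Path "/subscriptions/$subId/resourceGroups/$rg/providers/Microsoft.OperationalInsights/workspaces/$ws/tables/DHCPLOG_CL?api-version=2022-10-01" `
    -Payload $tableBody
```

The custom table name must end with the `_CL` suffix, which is why the workbook expects DHCPLOG_CL.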

This approach does not require any customization, since all the necessary information is requested as parameters during the script execution. However, you might need to import a different set of fields, so consider the following:

  1. The table definition schema template includes only the meaningful fields. If you need fewer, more, or all of the fields in the original log, you need to adapt the table schema.
  2. The transformation rule has been defined according to the table schema. If you change the table schema, make sure you adapt the transformation rule template as well prior to running the script.
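As a point of reference, a DCR transformation for the Windows DHCP audit log (a comma-separated text file) typically splits each raw line into columns. The sketch below is not the attached template's exact `transformKql`; the column positions are assumptions based on the standard DHCP server log format (ID, Date, Time, Description, IP Address, Host Name, MAC Address):

```kusto
source
| extend fields = split(RawData, ',')
| extend
    EventId     = tostring(fields[0]),
    Description = tostring(fields[3]),
    IPAddress   = tostring(fields[4]),
    HostName    = tostring(fields[5]),
    MACAddress  = tostring(fields[6])
| project TimeGenerated, EventId, Description, IPAddress, HostName, MACAddress
```

If you add or remove columns in the table schema, the `extend` and `project` clauses must change in lockstep with it.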

 

With that said, let us try it:

  1. Open a PowerShell prompt and launch the script with no parameters:

BrunoGabrielli_10-1688495639401.png

 

  2. The script will ask for the Azure Active Directory Tenant ID. This is required to correctly scope the authentication:

BrunoGabrielli_11-1688495639405.png

 

  3. Once the Azure Active Directory Tenant ID has been entered, the script moves on with the authentication, asking you to pick one of the recently used accounts or to enter a new one with the corresponding password:

BrunoGabrielli_12-1688495639409.png

 

  4. As the next step, a grid will show up, allowing you to select the subscription you want to use:

BrunoGabrielli_13-1688495639411.png

 

  5. From this point on, specific information will be requested, such as:
    1. The Resource Group containing the Log Analytics workspace to be used
    2. The name of the Log Analytics workspace that will host the custom table
    3. A name for the Custom table
    4. A name for the DCE
    5. A name for the DCR
    6. The name of the Json template file containing DCE and DCR definitions

BrunoGabrielli_14-1688495639414.png

 

  6. The execution will continue, letting you know about each step and its outcome. If everything goes smoothly you will get the following results:

BrunoGabrielli_15-1688495639418.png

 

At this point you just need to associate the newly created DCR with the DHCP server(s), making sure to set the endpoint to the newly created DCE.
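If you prefer to script the association as well, the Az.Monitor module provides `New-AzDataCollectionRuleAssociation`. The sketch below assumes an Azure VM target; both resource IDs are placeholders, and parameter names may vary slightly across Az.Monitor versions:

```powershell
# Sketch: associate the DCR with a DHCP server (Azure VM or Arc-enabled machine).
# Both resource IDs are placeholders - substitute your own values.
$vmId  = "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.Compute/virtualMachines/<dhcp-server>"
$dcrId = "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.Insights/dataCollectionRules/<dcr-name>"

New-AzDataCollectionRuleAssociation `
    -TargetResourceId $vmId `
    -AssociationName  "dhcp-logs-dcra" `
    -RuleId           $dcrId
```

For an Arc-enabled server, the target resource ID would point at a `Microsoft.HybridCompute/machines` resource instead.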

 

BrunoGabrielli_16-1688495639421.png

 

Easy enough, isn’t it :xd:

You can find the script, the table schema template and the DCE+DCR template files attached to the post.

 

#2: Visualize data through Azure Workbooks:

This step is not deadly difficult either. The documentation for creating an Azure Workbook, or for using Azure Workbooks templates, is there. Together with it, there is also documentation on the various Azure Workbooks data sources that can be used in a workbook, as well as the supported Workbook visualizations.

 

But you know me by now! I love to make my readers’ life easier, so I am going to add a ready-to-use workbook as part of this post. This first version includes tiles with aggregated information on events by DHCP server, events by Event Id, and events by description.
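Tiles like these are driven by simple aggregation queries against the custom table. The queries below are illustrative, assuming the table is named DHCPLOG_CL and exposes the EventId and Description columns from the attached schema:

```kusto
// Events by DHCP server (Computer is populated by the Azure Monitor Agent)
DHCPLOG_CL
| summarize Events = count() by Computer

// Events by Event Id
DHCPLOG_CL
| summarize Events = count() by EventId
```

Switching the `by` clause to Description yields the third tile in the same way.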

 

BrunoGabrielli_17-1688495639423.png

 

 

As well as a grid with all log entries with a search box on top.

BrunoGabrielli_18-1688495639429.png

 

The search box is a superb feature of workbooks: it searches against everything showing up in the grid. Really amazing!

 

It goes without saying that this solution applies to both Azure virtual machines and Arc-Enabled servers.

 

Happy observing :smile:

 

Disclaimer

The sample scripts are not supported under any Microsoft standard support program or service. The sample scripts are provided AS IS without warranty of any kind. Microsoft further disclaims all implied warranties including, without limitation, any implied warranties of merchantability or of fitness for a particular purpose. The entire risk arising out of the use or performance of the sample scripts and documentation remains with you. In no event shall Microsoft, its authors, or anyone else involved in the creation, production, or delivery of the scripts be liable for any damages whatsoever (including, without limitation, damages for loss of business profits, business interruption, loss of business information, or other pecuniary loss) arising out of the use of or inability to use the sample scripts or documentation, even if Microsoft has been advised of the possibility of such damages.

Last update: Aug 04 2023 12:41 AM