Microsoft Sentinel Blog

Automating Watchlists in Microsoft Sentinel

AryaG (Microsoft)
Jul 14, 2025

Learn how to update Microsoft Sentinel watchlists from a Blob Storage account where public access has been restricted.

Introduction

A huge thank you to MariaSousaValadas for contributing to and reviewing this post.

In a SIEM, you may need to upload data for correlation, such as high-value assets, IP ranges from your offices, or information about terminated employees. This data usually lives in external files.

In Microsoft Sentinel, this can be easily done through Watchlists, a feature that allows uploading local CSV files or CSV files stored in a storage account. While you can do this from the portal, you can also automate it using APIs. To read more about watchlists, check Watchlists in Microsoft Sentinel - Microsoft Sentinel | Microsoft Learn.
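For reference, a watchlist source file is simply a CSV with a header row. Below is a hypothetical example of a high-value assets list; the column names and values are purely illustrative.

hostname,ip_address,owner
srv-dc01,10.10.1.5,IT
srv-fin02,10.10.2.8,Finance
srv-hr01,10.10.3.4,HR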

Due to the sensitivity of your data, it is possible that you want to protect it in a storage account with limited Internet access. In this scenario, you will not be able to simply upload using the Defender portal. Instead, you can use an Azure function that will read from it and then upload it to Microsoft Sentinel in an automated fashion. In this blog post, we will show you how you can do this using a script we are providing.

However, watchlists are not always flexible enough, and you may hit limitations that require a different solution. You should read about these limitations first, but to keep it simple, we recommend always using watchlists, since they are a built-in feature with many UI options (such as editing or adding new items from the portal), unless:

  • You want to use multiple columns for correlation (if you use watchlists, the use of the SearchKey column is highly recommended).
  • You think you'll hit the limit of 10 million items: you can have up to 10 million active watchlist items across all watchlists in a workspace. Deleted items don't count.
  • You often delete items from your source file. The script we are sharing in this article refreshes your watchlist every time it runs; however, items that are deleted from your source file will not be automatically removed from your watchlist.

To address these limitations, we also provide an alternative version that lets you store your lists in a custom table.

This blog post contains:

  • Prerequisites to setup this flow
  • Configuration of Azure Function
  • Configuration of Storage Account
  • Azure Function code and explanation
  • Local debugging setup
  • Deployment of Azure Function

At the end of the article, we will detail the specific steps for custom logs. The full code of this blog post is available on GitHub; scroll down to the summary section to see it.

 

Prerequisites

 

To start, you need to make sure you have at least the following setup:

 

Setting up Azure Function

 

We will start with creating the Azure Function. This allows us to create the necessary identities and network requirements for the app that we can then use in the subsequent steps. We won't deep dive into the code just yet.

 

When creating an Azure Function, it is important to choose a plan that supports virtual networking. This is what will allow us to disable unrestricted public access to our storage account.

 

 

 

In the Basics tab, make sure that the runtime stack is set to Python.

 

 

I also recommend enabling Application Insights under the Monitoring tab, as this will allow you to quickly diagnose any issues.

 

 

 

The next section to configure is Networking. In this tab, select "Enable virtual network integration" and "Enable VNet integration" under the Outbound access section. You will have to choose which virtual network and subnet your Azure Function will use to integrate with other services. We can then configure our storage account to allow access only from this VNet.

 

 

Go ahead and finish the creation of the Function. Once the function is created, we need to create a managed identity for it. This will be required later to allow the Azure Function to read from the storage account. Open your function, go to Settings and turn on the system-assigned managed identity under "Identity".

 

 

 

Lastly, we are going to add some environment variables that we will later use in our code. This makes your code more secure, as it will not contain any hard-coded values. Navigate to "Environment variables" and add the following variables. The easiest way to do this is to use "Advanced edit", add them as one JSON string, and adjust the values. Make sure that you do not overwrite the existing variables, as they are necessary for the Azure Function to work.

    {
        "name": "WATCHLIST_NAME",
        "value": "LAB",
        "slotSetting": false
    },
    {
        "name": "AZURE_SUBSCRIPTION_ID",
        "value": "LAB",
        "slotSetting": false
    },
    {
        "name": "RESOURCE_GROUP_NAME",
        "value": "LAB",
        "slotSetting": false
    },
    {
        "name": "WORKSPACE_NAME",
        "value": "LAB",
        "slotSetting": false
    },
    {
        "name": "FILE_NAME",
        "value": "LAB",
        "slotSetting": false
    },
    {
        "name": "STORAGE_ACCOUNT_NAME",
        "value": "LAB",
        "slotSetting": false
    },
    {
        "name": "STORAGE_CONTAINER_NAME",
        "value": "LAB",
        "slotSetting": false
    },
    {
        "name": "WATCHLIST_SEARCH_KEY",
        "value": "LAB",
        "slotSetting": false
    },
    {
        "name": "WATCHLIST_PROVIDER",
        "value": "LAB",
        "slotSetting": false
    },
    {
        "name": "WATCHLIST_DESCRIPTION",
        "value": "LAB",
        "slotSetting": false
    }

 

  • WATCHLIST_NAME: The name of the watchlist to be updated or created.
  • AZURE_SUBSCRIPTION_ID: The subscription ID for the Azure account.
  • RESOURCE_GROUP_NAME: The name of the resource group in Azure which houses your workspace.
  • WORKSPACE_NAME: The name of the workspace in Microsoft Sentinel.
  • FILE_NAME: The name of the file to be read from the storage account.
  • STORAGE_ACCOUNT_NAME: The name of the Azure storage account.
  • STORAGE_CONTAINER_NAME: The name of the container within the storage account.
  • WATCHLIST_SEARCH_KEY: The search key to be used in your watchlist.
  • WATCHLIST_PROVIDER: A name that will be set as the watchlist provider.
  • WATCHLIST_DESCRIPTION: A description for the watchlist to be uploaded.
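As a small optional addition, you can fail fast when one of these variables is missing. The snippet below is a minimal sketch (not part of the original script) that you could place near the top of your function code and call at the start of the timer function; the variable names match the list above.

import os

REQUIRED_VARIABLES = [
    "WATCHLIST_NAME", "AZURE_SUBSCRIPTION_ID", "RESOURCE_GROUP_NAME",
    "WORKSPACE_NAME", "FILE_NAME", "STORAGE_ACCOUNT_NAME",
    "STORAGE_CONTAINER_NAME", "WATCHLIST_SEARCH_KEY",
    "WATCHLIST_PROVIDER", "WATCHLIST_DESCRIPTION",
]

def check_environment():
    """Raise a clear error if any required environment variable is missing."""
    missing = [name for name in REQUIRED_VARIABLES if not os.getenv(name)]
    if missing:
        raise ValueError(f"Missing environment variables: {', '.join(missing)}")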

 

If you want to use custom tables, add the following variables as well:

[
    {
        "name": "DCE_URL",
        "value": "LAB",
        "slotSetting": false
    },
    {
        "name": "DCR_RULE_ID",
        "value": "LAB",
        "slotSetting": false
    },
    {
        "name": "STREAM_NAME",
        "value": "LAB",
        "slotSetting": false
    }
]

 

 

As a last step of configuration, make sure that the managed identity of the Azure Function has the Microsoft Sentinel Contributor role on your workspace so that it can update the watchlist.

In the next section we will configure our storage account.

Configuring Storage Account

 

You can create a new storage account, but if you already have an existing one, you can use that as well. When you create an Azure Function, a storage account is automatically created to store logs and deployment files; you can also use that one. There are two things we need to configure: one is to allow incoming connections from our virtual network, and the other is to allow the managed identity of the Function to read the files in a specific container.

For the first step, open the storage account and navigate to Networking under "Security + networking". Here, select the option "Enabled from selected virtual networks and IP addresses" under Public network access. Under Virtual networks, click "Add existing virtual network" and select the virtual network and subnet that you chose in the previous section.

 

 

The result should look something like this:

 

 

 

I also recommend adding your own IP address in the Firewall section. This allows us to test the Function locally later without having to deploy it first. Don't forget to save the changes you made.

 

 

This effectively disallows public access while still allowing our Function to reach the storage account. If you want to disable the option to use SAS tokens, you can read more on this page: Prevent authorization with Shared Key - Azure Storage.

 

The second configuration piece is to allow the managed identity of our Azure Function to read the files within a container. For this, open the container where the files will be stored and navigate to the IAM section. There, add a new role assignment and look for either Storage Blob Data Reader or Storage Blob Data Contributor. If you only need to read the files, Storage Blob Data Reader is sufficient. If you want to test your code locally, you can also add these permissions to your own user account.

 

 

 

 

In the Members section, add your Azure Function; it should show up when you select "Managed identity" as an option.

 

 

We’re now ready to deploy the code. I will be using Visual Studio Code to walk you through the remaining sections.

 

Configure your Function

 

In the following sections, we will create a local workspace where we can start implementing our function trigger, configure our requirements, and connect to our storage account to read the data. To achieve this, we will create some helper functions. Afterwards, we will push this data to the watchlist API.

Create a trigger

 

We are going to create a timer trigger. This will allow you to periodically read the content of your CSV file and update your watchlist. In VS Code, make sure you have the Azure and Azure Functions extensions installed and signed in. If you need help with that, you can follow Configure Visual Studio Code for Azure development with .NET and Develop Azure Functions by using Visual Studio Code to connect your VS Code to your function.

Once your Function is connected to VS Code, we are going to create a Timer Trigger. For this, open the Azure Functions extension, and in the Workspace pane (bottom pane), right-click the little Function icon and click "Create Function...".

 

 

 

This will give you a new pop-up that lets you choose the location to store your Function, the language and runtime, and the type of trigger. Select the location where you want to store it, then select Python and, finally, Timer Trigger. We could also opt for a blob trigger, which would automatically run the function every time our blob is updated.

 

 

 

Provide a name for your function.

 

 

 

Then, provide a CRON expression that defines how often this function should run. If you are not sure what to put there, have a look at this page, which explains the format: Timer trigger for Azure Functions. For example, if we want to run the function every day at 02:00, we use this expression: 0 0 2 * * *
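As a quick, non-exhaustive reference, the timer trigger uses a six-field NCRONTAB expression in which seconds come first. A few illustrative schedules, written here as Python comments:

# NCRONTAB format: {second} {minute} {hour} {day} {month} {day-of-week}
# "0 0 2 * * *"     -> every day at 02:00
# "0 */30 * * * *"  -> every 30 minutes
# "0 0 2 * * MON"   -> every Monday at 02:00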

 

 

Libraries

Our code is going to use the libraries provided by Microsoft to authenticate and to send data to the APIs. This makes it much easier to connect to the REST APIs and to keep up with changes to them. In the local folder you selected for your function, check if there is a "requirements.txt" file and, if not, go ahead and create it. This will allow the Python package manager to fetch the requirements listed in the file and install them for us.

Update the content of the requirements.txt file by adding the lines below, then save the file. We are fetching the libraries for:

  • Security Insights (azure.mgmt.securityinsight) that will let us update the watchlist
  • Blob Storage (azure.storage.blob) that will let us read the content of our CSV file
  • Identity (azure.identity) that will let us authenticate to the above-mentioned services
  • Ingestion (azure.monitor.ingestion): this is optional, but allows us to write to a custom table if you want to use the alternative to watchlists

 

azure.mgmt.securityinsight 
azure.storage.blob 
azure.identity 
azure.monitor.ingestion

 

Then, run this command in a terminal so that the packages are fetched and installed locally. I recommend using a virtual environment for your project; otherwise these packages will be installed globally. For more information, see Install packages in a virtual environment using pip and venv - Python Packaging User Guide. When you test your code locally, the Core Tools will automatically look for a virtual environment as well.

pip install -r requirements.txt

 

This will fetch the packages and install them for us. Next, we are going to import these packages in our code. Open the file where your function code resides (usually called function_app.py or something similar) and, at the top, add the following code below the imports that are already there. We are also going to import the "os" package, a built-in Python package that allows us to read environment variables.

 

import logging
import azure.functions as func
# Import DefaultAzureCredential from the identity package, used for authentication
from azure.identity import DefaultAzureCredential
# Import SecurityInsights, which allows us to write to the Watchlist API
from azure.mgmt.securityinsight import SecurityInsights
# Import the Watchlist model that allows us to create a watchlist
from azure.mgmt.securityinsight.models import Watchlist
# Import the service client to read from our storage account
from azure.storage.blob import BlobServiceClient
# Import the log ingestion client to write to the Log Analytics API for creating custom logs
from azure.monitor.ingestion import LogsIngestionClient
# Imports for CSV manipulation
import csv
from io import StringIO
# Import os to read environment variables
import os
# Import HttpResponseError to handle errors from the Log Ingestion API
from azure.core.exceptions import HttpResponseError

 

This code imports specific modules from our packages; it is best practice to import only what you need rather than the entire package. Let's start adding some code.

Authenticating to the storage account

All the libraries we will use (Blob, SecurityInsights and LogsIngestion) require authentication. This is done with OAuth 2.0; however, the Azure Identity library simplifies it immensely for us. The file that contains your code also includes the decorator for your timer function, which might look like this:

# ... imports shown above ...

app = func.FunctionApp()

@app.timer_trigger(schedule="0 0 2 * * *", arg_name="myTimer", run_on_startup=False, use_monitor=False)
def update_watchlist(myTimer: func.TimerRequest) -> None:
    if myTimer.past_due:
        logging.info('The timer is past due!')

    logging.info('Python timer trigger function executed.')

 

Above this function (and its decorator), add a simple line to fetch the credentials that our libraries will use, like so:

# rest of your code 

# This line is already there
app = func.FunctionApp() 

credential = DefaultAzureCredential() 

# rest of your code

 

The DefaultAzureCredential() class from the Azure Identity library simplifies authentication by automatically selecting the most appropriate credential based on the environment. It supports multiple authentication methods, including managed identity, environment variables, and Visual Studio credentials, making it ideal for both local development and production. For a full list of supported options, refer to the documentation. Once deployed, the Azure Function's managed identity that we created earlier will be used for authentication and authorization.

I also recommend logging in locally with the Azure CLI on your machine, as that is one of the places DefaultAzureCredential() looks for a token. This is ideal for local testing and lets you skip creating an app registration in Entra; the account you are logged in with locally will be used to authenticate. The caveat is that you might have more permissions than the managed identity, so your code working locally is no guarantee that it will work in the Function.
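If you want to confirm that DefaultAzureCredential() can actually acquire a token on your machine, a tiny standalone sketch like the one below (not part of the function itself) can help. It simply requests a token for the Azure Resource Manager scope and will raise a descriptive error if no credential in the chain works.

from azure.identity import DefaultAzureCredential

credential = DefaultAzureCredential()

# Request a token for the Azure Resource Manager scope to verify the credential chain
token = credential.get_token("https://management.azure.com/.default")
print("Token acquired, expires at (epoch):", token.expires_on)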

Reading from the Storage Account

Next up, we will add a function that reads the content of the CSV file. For this, we need the file name, the storage account name, the container in which the blob is stored, and a set of credentials. We will pass all of these as parameters. Somewhere in your file, paste the code below.

def read_watchlist_from_storage(file_name, storage_account_name, container_name, credential):
    """
    Read data from a storage account container.
    
    :param file_name: Name of the file to read
    :param storage_account_name: Name of the storage account
    :param container_name: Name of the container in which the file is stored
    :param credential: Azure credentials for authentication
    :return: Tuple of the file content as a string and a Boolean indicating whether batching is needed
    """
    if not file_name.endswith('.csv'):
        raise ValueError("File must be a CSV")

    blob_service_client = BlobServiceClient(account_url=f"https://{storage_account_name}.blob.core.windows.net", credential=credential)
    blob_client = blob_service_client.get_blob_client(container=container_name, blob=file_name)

    download_stream = blob_client.download_blob()
    file_content = download_stream.readall().decode('utf-8')

    # Remove this logging line in production to avoid leaking data
    logging.info("Read from Storage account: " + file_content[0:100])

    needsBatching = False

    # TODO: Check if file exists and is not empty

    # The watchlist API rejects payloads larger than roughly 3.8 MB, so flag large files for batching
    if blob_client.get_blob_properties().size >= 3984588:
        needsBatching = True

    return file_content, needsBatching

 

This function first does a simple check that the provided file name is a CSV file. It then creates a BlobServiceClient that connects to the storage account using the token from DefaultAzureCredential() which we fetched earlier, and gets a blob client for our file. Finally, it downloads the file and reads its content.

Next, the function checks how large the file is. The watchlist API does not allow us to send content larger than about 3.8 MB, so if the file exceeds that size in bytes, we also let the caller know that batching might be required by returning a Boolean value next to the file content. Note that this code leaves some room for improvement: we do not do any safety checks on whether the file exists or is empty. We also write some of the file's data to the logging mechanism, which could be a potential data leak; remove that logging line in production, but for local testing it is nice to have.
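If you want to address that TODO, a minimal sketch along these lines could work (assuming a recent azure-storage-blob 12.x release, where BlobClient exposes exists()); it is not part of the original script.

def blob_is_usable(blob_client):
    """Return True only if the blob exists and is not empty (hypothetical helper)."""
    # logging is already imported at the top of the file
    if not blob_client.exists():
        logging.error("Blob not found in the container")
        return False
    if blob_client.get_blob_properties().size == 0:
        logging.error("Blob exists but is empty")
        return False
    return True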

Parsing String to CSV

The data that we have read from our blob storage is just one large string. To manage this more efficiently, we are going to add a small helper function that converts the data to CSV rows in Python. This allows for easy batching and reading. We will be using the csv and StringIO modules for this; both are built into Python. If you haven't already, add these two lines where you added your other imports:

 

# Import for CSV manipulation
import csv 
from io import StringIO

Then, add this function somewhere in your file.

def parse_csv_string(csv_string):
    csv_data = []
    csv_reader = csv.reader(StringIO(csv_string))
    for row in csv_reader:
        csv_data.append(row)
    return csv_data

 

This does nothing more than read the data that is passed as a string, parse it as CSV and append each row to a list.
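A quick illustration with made-up data:

sample = "hostname,owner\nsrv01,IT\nsrv02,Finance"
print(parse_csv_string(sample))
# [['hostname', 'owner'], ['srv01', 'IT'], ['srv02', 'Finance']]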

Updating your watchlist

Next, we are going to add a function that creates or updates a watchlist based on the items you provide. We will need some additional values, such as the watchlist name, description, search key, subscription ID, workspace name, resource group, and a set of credentials. We will pass these as parameters as well.

 

def update_watchlist_sentinel(watchlist_name, watchlist_items, description, resource_group_name, workspace_name, search_key, provider, credential, subscription_id):
    """
    Update a watchlist in Microsoft Sentinel.
    
    :param watchlist_name: Name of the watchlist to update
    :param watchlist_items: List of items to add to the watchlist
    :param description: Description of the watchlist
    :param resource_group_name: Name of the resource group where the workspace is located
    :param workspace_name: Name of the workspace where the watchlist is located
    :param search_key: Key to search for in the watchlist items
    :param provider: Name to set as the watchlist provider
    :param credential: Azure credentials for authentication
    :param subscription_id: Azure subscription ID
    """
    client = SecurityInsights(credential, subscription_id)

    watchlist = Watchlist()
    watchlist.display_name = watchlist_name 
    watchlist.items_search_key = search_key
    watchlist.provider = provider
    watchlist.source = "Local file"
    watchlist.raw_content = watchlist_items
    watchlist.number_of_lines_to_skip = 0
    watchlist.content_type = "text/csv"
    watchlist.description = description
    watchlist.watchlist_alias = watchlist_name
    client.watchlists.create_or_update(resource_group_name, workspace_name, watchlist_name, watchlist)

 

Here we create a SecurityInsights client using the credentials from DefaultAzureCredential() and the subscription ID. We then build a Watchlist object with the provided values and call create_or_update to create or update the watchlist. The number_of_lines_to_skip property is worth noting: if your file contains some initial content before your header row, you may want to adjust that number. The source property is another one you can adjust to your liking.
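For clarity, here is what a call to this helper could look like; all of the values below are placeholders, and credential is the DefaultAzureCredential() object we created earlier.

update_watchlist_sentinel(
    watchlist_name="HighValueAssets",
    watchlist_items="hostname,owner\nsrv01,IT\nsrv02,Finance",
    description="High value assets",
    resource_group_name="rg-sentinel-lab",
    workspace_name="law-sentinel-lab",
    search_key="hostname",
    provider="Contoso",
    credential=credential,
    subscription_id="00000000-0000-0000-0000-000000000000",
)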

Now that we have the building blocks to read from our storage account and to update a watchlist, we just need to stitch them together in our timer trigger function.

Piecing the puzzle together

The timer function that was initially created by VS Code as boilerplate can now be adjusted to call the helper functions we created above.

@app.timer_trigger(schedule="0 0 2 * * *", arg_name="myTimer", run_on_startup=False, use_monitor=False)
def update_watchlist(myTimer: func.TimerRequest) -> None:

    # Initialize variables from environment variables
    watchlist_name = os.getenv("WATCHLIST_NAME")
    subscription_id = os.getenv("AZURE_SUBSCRIPTION_ID")
    resource_group_name = os.getenv("RESOURCE_GROUP_NAME")
    workspace_name = os.getenv("WORKSPACE_NAME")
    file_name = os.getenv("FILE_NAME")
    storage_account_name = os.getenv("STORAGE_ACCOUNT_NAME")
    container_name = os.getenv("STORAGE_CONTAINER_NAME")
    provider = os.getenv("WATCHLIST_PROVIDER")
    search_key = os.getenv("WATCHLIST_SEARCH_KEY")
    description = os.getenv("WATCHLIST_DESCRIPTION")

    # If you want to use a custom table, please set the following variables in your environment
    # dce_url = os.getenv("DCE_URL")
    # rule_id = os.getenv("DCR_RULE_ID")
    # stream_name = os.getenv("STREAM_NAME")

    # Read watchlist items from storage
    watchlist_content, needsBatching = read_watchlist_from_storage(file_name, storage_account_name, container_name, credential)

    if needsBatching:
        batch_size = 800
        parsed_data = parse_csv_string(watchlist_content)
        header = parsed_data[0]
        rows = parsed_data[1:]
        for i in range(0, len(rows), batch_size):
            batch = rows[i:i + batch_size]
            # Every batch sent to the API must include the header row
            csv_content = ",".join(header) + "\n"
            for row in batch:
                csv_content += ",".join(row) + "\n"
            update_watchlist_sentinel(watchlist_name, csv_content, description, resource_group_name, workspace_name, search_key, provider, credential, subscription_id)
            # update_custom_table(csv_content, dce_url, rule_id, stream_name, credential)
    else:
        # Update the watchlist in Microsoft Sentinel
        update_watchlist_sentinel(watchlist_name, watchlist_content, description, resource_group_name, workspace_name, search_key, provider, credential, subscription_id)
        # update_custom_table(watchlist_content, dce_url, rule_id, stream_name, credential)

First, we fetch the parameters we need from the environment variables. The commented-out lines for DCE_URL, DCR_RULE_ID and STREAM_NAME are only needed if we are going to use custom tables (see next section).

Using environment variables is a great way to add both security and flexibility to your code. There is no plain-text configuration in your code if it ever gets leaked, and you can change the parameters without redeploying the code, as the variables are read every time the function is executed.

We then read the file content and learn whether we need to perform batching. The commented-out update_custom_table calls are only relevant for the custom tables option (see next section). If batching is needed, we set a batch size, parse the CSV so we can easily loop over the rows, and skip the header row when building batches; because every batch sent to the API should include the header, we prepend it to each batch before uploading that batch with our helper function. If batching is not required, we send the entire file content in one call.
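As a toy illustration of the batching arithmetic, using made-up data:

header = ["hostname", "owner"]
rows = [[f"srv{i:04}", "IT"] for i in range(1, 2001)]   # 2,000 data rows
batch_size = 800
batches = [rows[i:i + batch_size] for i in range(0, len(rows), batch_size)]
print([len(b) for b in batches])  # [800, 800, 400] - each batch is uploaded with the header prepended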

If you would like to use custom logs rather than a watchlist, please go to the next section, otherwise, please scroll down to the Testing section.

Using Custom logs

When we use the above function, the API does not do any tracking or versioning. If the watchlist doesn't exist, it creates it and then adds the rows. If it does exist, it updates the watchlist. This means that if you change rows in the CSV rather than just adding new ones, the old rows are still there and you will end up with duplicates.

 

You can solve this by adding your own custom code to track the difference between versions of this CSV, but this approach adds complexity to your setup. Another option could be to leverage the Log Ingestion API and ingest our watchlist data into a custom table, which is what we will do in this section.

 

The code in this section assumes that:

  1. You have already created the custom table
  2. You have the Data Collection Endpoint URL and Data Collection Rule immutable id from when you created the table

 

We are going to add a new function that takes the CSV data, converts it into JSON and posts it to the Log Ingestion API, since this API expects JSON data rather than CSV. For this piece of code, also add the import below at the top of your file (if you haven't already); it will allow us to catch any errors the API throws.

# Import HttpResponseError to handle errors from the Log Ingestion API 
from azure.core.exceptions import HttpResponseError

Then paste these functions somewhere in your file:

def csv_to_json(csv_string):
    """
    Convert CSV data to JSON format.
    
    :param csv_string: CSV data as a string
    :return: List of dictionaries (one per CSV row), ready for JSON serialization
    """
    csv_reader = csv.DictReader(csv_string.splitlines())
    json_data = [row for row in csv_reader]
    return json_data

def update_custom_table(csv_data, dce, rule_id, stream_name, credential):
    """
    Upload CSV data to a table in Microsoft Sentinel. Required permissions: Monitoring Metrics Publisher for the identity.

    :param csv_data: CSV data as a string
    :param dce: Data collection endpoint URL    
    :param rule_id: Rule ID for the DCR
    :param stream_name: Name of the stream to upload to in your DCR
    :param credential: Azure credentials for authentication
    :return: None
    """

    client = LogsIngestionClient(
        endpoint=dce, credential=credential, logging_enable=True
    )

    try:
        res = client.upload(rule_id=rule_id, stream_name=stream_name, logs=csv_to_json(csv_data))
        print(f"Upload succeeded: {res}")
    except HttpResponseError as e:
        print(f"Upload failed: {e}")

 

The first function takes a CSV string and converts it to a list of Python dictionaries, which is perfect for JSON conversion; we need this conversion because the Log Ingestion API works with JSON rather than CSV. The second function takes the CSV data, your data collection endpoint URL, data collection rule ID, stream name and credentials, creates a LogsIngestionClient and tries to upload the data.
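A quick example of what csv_to_json produces for a small, made-up CSV string (on Python 3.8+ the rows print as plain dictionaries):

sample = "hostname,owner\nsrv01,IT\nsrv02,Finance"
print(csv_to_json(sample))
# [{'hostname': 'srv01', 'owner': 'IT'}, {'hostname': 'srv02', 'owner': 'Finance'}]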

 

To make our timer function from the previous section work with this function rather than the watchlist API, we just need to comment out the lines that call update_watchlist_sentinel and uncomment the lines that call update_custom_table. Don't forget to also uncomment the lines that fetch the DCE, DCR and stream name parameters from the environment.

If you are going to update the file to add or remove values frequently, it is good practice to update it once per day. That way, you can look back at the last version of your watchlist and make sure you are ignoring deleted values.

Usage 

Once you have your watchlist in a custom logs table, you should filter by your last update; this way, both newly added items and deleted items are taken into consideration. For instance, if you run your function every day at 1am, you could run this query:

 

[yourtablename] | where ingestion_time() >= startofday(now()) + 1h

 

Testing

It's now time to test our code, and we can do this locally first. Press F5 to start debugging. When you do, VS Code will automatically try to activate a virtual environment, install the requirements (which we already did in a previous step) and then run the function locally. This should result in an output like the one below in the terminal pane at the bottom.

Please note that for local testing you will need the Azure Functions Core Tools (as stated in the prerequisites) and a virtual environment (Create a Python function from the command line - Azure Functions).

 

 

 

As we have a timer trigger, the function will not run automatically when debugging starts. There is an option that allows the function to run at startup, but it is set to false by default, so we either wait for our timer to trigger, or we run the function manually. To run it manually, open your code and set a breakpoint on the first line of your function by clicking next to the line number.
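Alternatively, for local testing only, you could temporarily set run_on_startup=True on the decorator so the function runs once as soon as the host starts; just remember to set it back to False before deploying. A minimal sketch:

# Temporary change for local debugging only
@app.timer_trigger(schedule="0 0 2 * * *", arg_name="myTimer",
                   run_on_startup=True, use_monitor=False)
def update_watchlist(myTimer: func.TimerRequest) -> None:
    ...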

 

 

Then, navigate to the Azure Functions extension pane and, at the bottom, expand your local project and its functions; you should see your function name. Right-click it and press "Execute Function Now". This will trigger your function, and you should see the debugger stop at the line where we set the breakpoint.

Press F10 (or your shortcut) to step over the first line. You will see the evaluation of the code and already a first issue: watchlist_name is initialized as None, which means the environment variable is not populated. This is because the variables that you set in the first step only exist in the Azure Function, not locally.

 

 

To fix this, open the file called "local.settings.json" that was generated when you created the function locally. Here, under "Values", add your variables below the ones that are already there.

 

    {
        "IsEncrypted": false,
        "Values": {
            "AzureWebJobsStorage": "",
            "FUNCTIONS_WORKER_RUNTIME": "python",
            "WATCHLIST_NAME": "example-watchlist",
            "AZURE_SUBSCRIPTION_ID": "00000000-0000-0000-0000-000000000000",
            "RESOURCE_GROUP_NAME": "example-resource-group",
            "WORKSPACE_NAME": "example-workspace",
            "FILE_NAME": "example.csv",
            "STORAGE_ACCOUNT_NAME": "examplestorageacct",
            "STORAGE_CONTAINER_NAME": "example-container",
            "WATCHLIST_SEARCH_KEY": "search-key",
            "WATCHLIST_PROVIDER": "provider-name",
            "WATCHLIST_DESCRIPTION": "watchlist-description"
        }
    }

 

Don't forget to also populate DCE_URL, DCR_RULE_ID and STREAM_NAME if you are going for the custom tables option. Then restart debugging. Now you should see the value of your variable.

You can now continue to run the code and debug where necessary.

Deploy

Assuming everything is working as expected, we are ready to deploy our code to Azure. For this, open the Azure Functions extension pane, right-click your local project in the bottom pane and click "Deploy to Azure..." or the Deploy icon.

 

 

This will then allow you to follow the steps to deploy your code to the Azure Function that we created earlier on.

 

 

Select your app, and you will be able to see in the bottom pane how the deployment is proceeding.

 

 

 

Now, your function will trigger automatically based on your CRON expression and update the watchlist and/or custom table; all you need to do is keep your list in the storage account up to date. If you want to see your function and its invocations, navigate to your Azure Function; you should see your function name on the overview page, along with a link that takes you to its invocations. There, you can see any failures and their reasons.

 

Summary

 

In this blog post, we provided a comprehensive guide on how to automate updating watchlists in Microsoft Sentinel using Azure Functions and the Microsoft Sentinel APIs. We discussed the limitations of manual updates via CSV files and Azure Storage and introduced a method to overcome them by leveraging Azure Functions to refresh watchlist data automatically, securely and more frequently.

We also provided you with a way to use custom log tables rather than watchlists if watchlists do not fit the purpose or you have more complex scenarios.

The entire code can be found on GitHub.
