Sentinel Playbooks and Darktrace


I don't suppose anyone's managed to successfully create a playbook that makes POST requests to a Darktrace cloud master appliance and is willing to share some pointers?


I'm trying to create a simple playbook to tag SaaS users but I'll be a monkey's uncle if I can figure out the correct syntax to use.


The docs say POST requests to this endpoint must be made with parameters (JSON is not supported) and give an example of: https://[instance]/tags/entities -d tag=Active Threat&did=1&duration=3600


Running this returns "INTERNAL ERROR" which isn't particularly useful.

Support have said the docs aren't quite right and that the POST URL should be /tags/entities?tag=TagName&did=1. This whole string, including the query parameters, then gets encoded into the signature, and alongside this URL we also need to send the same parameters as form data in the body: {'tag': TagName, 'did': 1}


The latter body is invalid JSON, which the HTTP connector won't allow, so even fudging things a bit to get the right format just results in "API INVALID SIGNATURE".



5 Replies


Have you tried POSTing to the URL support gave (/tags/entities?tag=TagName&did=1) with no body? What support have given you should also work, but I don't understand why they'd want you to repeat the same parameters and values in the body? Seems excessive to me.

What are you putting in headers? DTAPI-Token, DTAPI-Date and DTAPI-Signature seem to be required for auth.




Thanks for the reply. Yeah, all the headers are sent through accordingly, and if we run this in Python it works absolutely fine. I agree that sending the payload as parameters AND as a body is redundant, but... *shrug*


import requests
from datetime import datetime
import hmac
import hashlib
import json

# ssl verification disabled

def post_api(url, args):

    server = 'https://' # appliance URL
    public = '' # public token
    private = '' # private token
    date = datetime.utcnow().strftime('%Y%m%dT%H%M%S')
    sig ='ASCII'),
                   (url + '\n' + public + '\n' + date).encode('ASCII'),
    headers = {'DTAPI-Token': public, 'DTAPI-Date': date,
               'DTAPI-Signature': sig, 'Accept': '*/*'}

    r = + url, data=args, headers=headers, verify=False)
    try:
        resp = json.loads(r.text)
        return resp
    except ValueError:
        return "INVALID API RESPONSE"

result = post_api('/tags/entities?tag=TestTag&did=131105', {'tag': 'TestTag', 'did': 131105})


In the Python code we send {'tag': 'TestTag', 'did': 131105} as a function arg, which is submitted as the body, but trying the same thing in Logic Apps just fails.

{
    "uri": "https://[redacted]/tags/entities?tag=TestTag&did=131105",
    "method": "POST",
    "headers": {
        "Accept": "*/*",
        "DTAPI-Date": "[redacted]",
        "DTAPI-Signature": "[redacted]",
        "DTAPI-Token": "[redacted]"
    },
    "body": "{'tag': 'TestTag', 'did': 131105}"
}

I had to do some fudging to get the body into that format, as the HTTP connector fails to validate the body as valid JSON (which is accurate -- it isn't). Putting it in proper JSON format escapes the string to something like {\"tag\": \"TestTag\",\"did\":131105}, which is also unsuccessful.


I suspect the HTTP connector is mangling something somewhere so it's likely a case of finding the combination of body format, content-types, etc to send to the API endpoint. I also suspect I've spent too long looking at this and missing something obvious.
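One possibly relevant detail: when `requests` is given a dict via `data=`, it form-encodes it (Content-Type: application/x-www-form-urlencoded) rather than sending JSON, which may be why the Python version works while the Logic Apps HTTP action, sending a literal JSON string, does not. A minimal sketch of what the Python body actually looks like on the wire:

```python
from urllib.parse import urlencode

# This is the wire format requests produces for data={'tag': ..., 'did': ...},
# not a JSON document at all.
body = urlencode({'tag': 'TestTag', 'did': 131105})
print(body)  # tag=TestTag&did=131105
```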

Not sure if this will make any difference, but maybe try adding these 2 headers:
"Accept": "application/json"
"Content-type": "application/json"

As for the body being reformatted with escape chars, you could try adding a Compose action before the HTTP request (unless it's a trigger action!) with the content of {"tag": "TestTag","did":131105} and then using this actions' output as your HTTP request body, if that makes sense! :)



Yeah, I've tried quite a few combinations of Accept and Content-Type headers to no avail so far. The problem is that the HTTP action expects a JSON body, so as soon as you try encapsulating the single-quoted form with {} it won't even allow a save due to validation errors.


The output I listed above is where I forced it into that format using a prior Compose action.


Honestly, I'm starting to think about cutting my losses and building it as a function app with an HTTP trigger, using the logic app purely as an orchestrator.


I've already built a function app that the logic app calls to perform the signature signing, so it almost feels like my method might be a little off. I wonder if the better solution would be to have the function app make the API calls and return the results to the playbook, which then returns them to Sentinel. That way I'd only need to pass the URL and args to the function app.


If you already have a function app that's reliably handling the signing, I wouldn't try reinventing the wheel here. Yes, it would definitely be neat to have the whole thing in one place, but considering the effort, it's a waste of time in my mind.