Office 365 Email Activity and Data Exfiltration Detection
Published Feb 13 2020

This article shows how to use Office 365 message trace data in Azure Sentinel to analyze email activity and detect security scenarios such as data exfiltration.

 

Office 365 Message Trace contains a lot of information that can be useful for security analysts. While it doesn't include the message content itself, it provides interesting information about mail flow in the organization. It can also be used to detect malicious activity and to generate useful reports about mail flow (e.g. information about bulk mail, spoofed-domain emails, or an abnormal rate of e-mail sending). An abnormal sending rate, in particular, can indicate malicious data exfiltration from within the organization. In this article we describe how to use Office 365 Message Trace and Azure Sentinel to detect these security scenarios.

 

Update 3rd June 2020 - while this article uses Logic Apps to ingest message trace data, you can consider another, perhaps more elegant approach based on an Azure Function. For more details, see the article published by my colleague @Jon Nordström: Ingesting Office 365 Message Traces to Sentinel

 

Accessing Office 365 Message Trace 

Office 365 message tracking logs can be accessed directly through the web interface in the Security & Compliance Center or through PowerShell (via the Get-MessageTrace cmdlet). Additionally, for programmatic access there is the Office 365 Message Trace Reporting Web Service, which is the service we will use in this article. It can be accessed through a REST URI at https://reports.office365.com/ecp/reportingwebservice/reporting.svc/MessageTrace?. By default, it returns 30 days of message trace data. To filter results you can provide additional parameters in the URI, as in the example below where we request data within a two-day timeframe. Also note that if you provide StartDate, you must also provide EndDate.

https://reports.office365.com/ecp/reportingwebservice/reporting.svc/MessageTrace?\$filter=StartDate%...'

Office 365 Message Trace can be queried in the web interface for up to 30 days of data. If the reporting service is queried for a period longer than 30 days, it returns an empty dataset. Also, while data about messages is available as soon as they are sent or received, it can take up to 24 hours before messages appear in the reporting service.
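For illustration, here is a minimal PowerShell sketch of such a filtered query. It assumes the standard OData datetime filter syntax shown above and the service account created in the next section; the account name and two-day window are placeholders:

# Build a two-day query window in the OData datetime format the service expects
$start = (Get-Date).AddDays(-2).ToUniversalTime().ToString("yyyy-MM-ddTHH:mm:ssZ")
$end = (Get-Date).ToUniversalTime().ToString("yyyy-MM-ddTHH:mm:ssZ")
$uri = "https://reports.office365.com/ecp/reportingwebservice/reporting.svc/MessageTrace" +
    "?`$filter=StartDate eq datetime'$start' and EndDate eq datetime'$end'"
# The service uses basic authentication; prompt for the reporting account credentials
$cred = Get-Credential
# Request JSON instead of the default XML via the Accept header
Invoke-RestMethod -Uri $uri -Credential $cred -Headers @{ Accept = "application/json" }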

 

Creating Service Account

Before accessing the Office 365 Message Trace service, we need to create an Office 365 service account. This account needs a very strong password, as the service does not support OAuth 2.0.

The service account can be created in the Office 365 Security & Compliance Center or with PowerShell. To manage Office 365 with the PowerShell module, follow the steps in Connect to Office 365 PowerShell.

 

Here are the cmdlets to create the service user and its role group:

 

# Derive the tenant's default onmicrosoft.com domain and build the service account UPN
$TenantDomain = (Get-MsolAccountSku).AccountSkuId[0].Split(":")[0] + ".onmicrosoft.com"
$UserName = "msgtracereporting@" + $TenantDomain
# Replace with a strong, generated password of your own - the account uses basic authentication
$Pwd = "O365Msg-TracE"
# Create the service user with a non-expiring password
New-MsolUser -UserPrincipalName $UserName -DisplayName "Message Trace Reporting" -Password $Pwd -ForceChangePassword $False -PasswordNeverExpires $True -UsageLocation "NL"
# Create a role group with the minimum roles needed to read message trace data
$RoleGroup = New-RoleGroup -Name "Message Trace Reporting" -Roles "Message Tracking", "View-Only Audit Logs", "View-Only Configuration", "View-Only Recipients" -Members $UserName

 

Note: If you run into issues with the New-RoleGroup cmdlet, make sure you are connected to Exchange Online PowerShell as described here - https://docs.microsoft.com/en-us/powershell/exchange/exchange-online/exchange-online-powershell-v2/e...

 

Once you have the service account created, you can test the service by running a simple curl command:

curl -v --user msgtracereporting@tenantdomain:password "https://reports.office365.com/ecp/reportingwebservice/reporting.svc/MessageTrace?\$filter=StartDate%...'"

 

By default, the Office 365 reporting web service returns an XML dataset, but you can change the result set to JSON by specifying it in the request headers. We will work with JSON, as Azure Sentinel works with JSON by default and Logic Apps has better support for JSON than for XML. To get JSON data, just include -H "Accept: application/json" in the curl command.
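For example, an illustrative complete command (with a URL-encoded two-day filter; substitute your own account, password, and dates):

curl -v --user "msgtracereporting@tenantdomain:password" -H "Accept: application/json" "https://reports.office365.com/ecp/reportingwebservice/reporting.svc/MessageTrace?\$filter=StartDate%20eq%20datetime'2020-02-10T00:00:00Z'%20and%20EndDate%20eq%20datetime'2020-02-12T00:00:00Z'"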

 

Creating Logic Apps playbook

We will retrieve the data and ingest it into Sentinel through a Logic Apps playbook.

 

Note: there are other ways message trace data can be ingested, e.g. through Logstash, a custom function, or a scheduled job that pushes data through the HTTP Data Collector API.

 

Now, let's go through the Logic Apps playbook creation. First, create a new playbook in the Azure Sentinel Playbooks section and choose a resource group and location.

 

Next, we need to choose the playbook trigger. We will use the Logic Apps scheduled (Recurrence) trigger, set to run at a daily interval, but you can choose any period; just remember the 30-day maximum window for retrieving message trace data.
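For reference, a daily Recurrence trigger looks roughly like this in the Logic Apps code view (a sketch of the standard trigger definition):

{
    "triggers": {
        "Recurrence": {
            "type": "Recurrence",
            "recurrence": {
                "frequency": "Day",
                "interval": 1
            }
        }
    }
}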

 

Message trace data will be ingested into the EmailEvents custom log table (EmailEvents_CL). We will reference this table throughout the article.

 

As the playbook will run on a scheduled interval, we need to decide which data interval to query during each execution. The simple approach would be to always take the period of the playbook recurrence: if the playbook runs every 24 hours, we would always request the last 24 hours of message trace data, i.e. the interval <(now()-1d), now()>. But this approach isn't flexible: if we decide to change the playbook interval (e.g. to 48 hours), we also need to update the data retrieval logic in the playbook itself. Also, if we do any troubleshooting and need to rerun the playbook, we can end up with duplicate data ingested into the message trace table. And as Azure Sentinel doesn't provide an option to delete data (it's a SIEM), we need to be careful about how we ingest data.

 

A more accurate approach is to base data retrieval on the timestamp of the latest ingested message trace and query data from that timestamp onward. To avoid an empty dataset in the (potentially rare) situation where the latest message is older than 30 days (as mentioned, the reporting service only returns data within a 30-day timeframe), we will start the query interval at max(latest_ingested_message_timestamp, now() - 30d), i.e. the more recent of the two.

 

To retrieve the timestamp of the latest ingested message, we will run the following KQL query:

EmailEvents_CL | summarize arg_max(Received_t, *) | project Received_t

 

Note: we use the arg_max function to return the row with the largest Received_t value, then project that column to get a single result.

Now we compute max(latest_ingested_message_timestamp, now() - 30d) using a union with the isfuzzy=true parameter, which ensures the query won't fail if the EmailEvents_CL table doesn't exist yet (as on the first playbook execution).

The final query:

 

union isfuzzy=true
(print Received_t=(now()-30d)), // query at most 30 days back
(EmailEvents_CL | summarize arg_max(Received_t, *)) // latest ingested message
| summarize max(Received_t)
| project max_Received_t = (max_Received_t + 1ms) // +1ms to avoid re-ingesting the latest message

 

We will now add the Run query and list results action to the playbook to execute the query:

 

Calling Office 365 Message Reporting Service

After we have the timestamp of the latest message, we can now call the Office 365 Message Trace Service.

First, we need to parse the result of the query execution in the previous step. We will use the Parse JSON action with the default schema generated from the return value of the query. We just changed the type from array to object under the items property; since we know we are querying for a single value, we can conveniently change the type to object, which returns a single item rather than an array with one item.

[Screenshot: parse_json2.png]

To call the O365 Message Trace Reporting Service we will use the HTTP action in Logic Apps, adding Accept: application/json to the Headers section to retrieve data in JSON format instead of XML:

[Screenshot: httpcall.png]

 

Also notice the expression referencing the most recent timestamp retrieved in the previous step, and the utcNow() function used to refer to the current date.
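As a sketch, the HTTP action URI combining both values might look like the following; the Parse_JSON_2 action name and the property path to max_Received_t are assumptions based on the steps above, so adjust them to your own action names:

https://reports.office365.com/ecp/reportingwebservice/reporting.svc/MessageTrace?$filter=StartDate eq datetime'@{body('Parse_JSON_2')?['value']?['max_Received_t']}' and EndDate eq datetime'@{utcNow()}'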

 

After we retrieve the message trace data, we will ingest it into Azure Sentinel. Before ingesting, we parse the result set returned by the HTTP call to the O365 reporting service using another Parse JSON action:

[Screenshot: parse_json.png]

For data ingestion we will use the Send Data action from the Log Analytics connector. Note that by default Logic Apps wraps this action in a for-each loop if you pass an array as a parameter. As Send Data supports ingesting a large JSON array at once, we can avoid the for-each cycle (each for-each iteration also generates additional Logic Apps cost) and ingest all retrieved messages in one call. To do so, just add "value" to the Send Data request body. You may not see "value" immediately in the list of dynamic properties; in that case, just type body('Parse_JSON')?['value'] into the expression dialog.

 

Important note: the Send Data action currently has a 30 MB limit for data ingestion, so if your playbook fails due to a large dataset, shorten the playbook recurrence interval so each run ingests less data. Additionally, you can implement logic that checks the message trace size and sends an alert (e.g. through an email action) if it's above 30 MB. You can check the size using the length function (@length(string(variables('value')))) or the Content-Length header from the response. Both measurements may be approximate due to encoding/stringification, but should be accurate enough for this purpose.
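For example, a Condition action could guard the Send Data step with an expression along these lines (31457280 bytes = 30 MB; the variables('value') reference assumes the parsed array was stored in a variable, as in the length expression above):

@lessOrEquals(length(string(variables('value'))), 31457280)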

 

[Screenshot: send_data.png]

And here's the result set after message trace ingestion into the EmailEvents_CL table:

[Screenshot: resultset.png]

 

Detecting Data Exfiltration

After we have ingested Office 365 Message Trace data into Azure Sentinel, we can start querying it and building security use cases. A common use case across organizations is detecting data exfiltration; one indicator of exfiltration is sending a large amount of data in a short timeframe.

 

Note: in the following queries, replace this article's tenant name m365x175748.onmicrosoft.com with your own Office 365 domain/tenant name. If you use multiple domain names, add an additional condition to the query for each domain.

 

To detect data exfiltration, we will build the KQL query in two steps.

First, we create a query that calculates a baseline for the number of sent messages:

let sending_threshold = toscalar(
EmailEvents_CL
| where Received_t >= startofday(ago(7d)) and Received_t < startofday(now())
| summarize cnt=count() by SenderAddress_s, bin(Received_t, 1d)
| summarize avg(cnt), stdev(cnt)
| project threshold = avg_cnt+stdev_cnt);
print sending_threshold

 

After sending_threshold is calculated, we can form the full query that checks for deviations above the threshold. For more details on how this query was formulated, check one of the recent Azure Sentinel webinars on rule creation at https://aka.ms/SecurityWebinars.

 

let sending_threshold = toscalar(
EmailEvents_CL
| where Received_t >= startofday(ago(7d)) and Received_t < startofday(now()) and RecipientAddress_s !endswith "m365x175748.onmicrosoft.com"
| summarize cnt=count() by SenderAddress_s, bin(Received_t, 1d)
| summarize avg(cnt), stdev(cnt)
| project threshold = avg_cnt+stdev_cnt);
EmailEvents_CL
| where Received_t >= ago(1d)
| summarize count() by SenderAddress_s
| where count_ > sending_threshold

 

Once we have the query, we can create a Sentinel alert rule and start being alerted on anomalous data exfiltration.

Additional information from Office 365 Message Trace

Top 10 senders by message count:

EmailEvents_CL
| summarize Amount=count() by SenderAddress_s
| top 10 by Amount

 

Top 10 recipients by message count:

EmailEvents_CL
| summarize Amount=count() by RecipientAddress_s
| top 10 by Amount

 

Mail Flow over time:

EmailEvents_CL
| summarize count() by bin(Received_t, 30m)
| render timechart

 

Summary of internal/external inbound vs. outbound email:

EmailEvents_CL
| summarize InternalEmail = countif(SenderAddress_s endswith "m365x175748.onmicrosoft.com" and RecipientAddress_s endswith "m365x175748.onmicrosoft.com" ), OutboundEmail = countif(SenderAddress_s endswith "m365x175748.onmicrosoft.com" and RecipientAddress_s !endswith "m365x175748.onmicrosoft.com" ), InboundEmail= countif(SenderAddress_s !endswith "m365x175748.onmicrosoft.com" and RecipientAddress_s endswith "m365x175748.onmicrosoft.com" ) by bin_at(Received_t, 1h, now())
| render timechart

 

Top 10 largest email messages by message size:

EmailEvents_CL
| top 10 by Size_d

 

Also, we can use Message Trace data to check whether the organization has received any e-mail from a lookalike domain (e.g. contoso.com vs. c0nt0so.om). Such domain impersonation can be an indicator of a phishing attack. One option is to use a tool like dnstwist (there's also an online version at https://dnstwister.report/) to generate a list of plausible permutations of your domain, store it as a lookup table, and then use it in a query joined with the EmailEvents table (more about how to use lookup tables with Azure Sentinel).
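Here is a minimal KQL sketch of this idea, assuming the permutation list is small enough to inline; the domains below are hypothetical placeholders for dnstwist output:

// Hypothetical lookalike domains generated with a tool like dnstwist
let lookalike_domains = datatable(Domain:string)
[
    "c0nt0so.com",
    "contoso.co"
];
EmailEvents_CL
// Extract the sender's domain from the address
| extend SenderDomain = tostring(split(SenderAddress_s, "@")[1])
// Keep only mail received from a lookalike domain
| where SenderDomain in (lookalike_domains)
| project Received_t, SenderAddress_s, RecipientAddress_s, Subject_s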

 

We have also created a sample workbook for security analysts based on the queries described above:

[Screenshot: workbook.png]

 

Summary

This article has demonstrated how to ingest Office 365 Message Trace logs into Sentinel. Office 365 Message Trace provides the underlying data for various interesting security scenarios and use cases such as data exfiltration detection. We have uploaded the JSON code and a screenshot of the playbook to GitHub. Apologies for the low screenshot quality, but it should be enough to understand the playbook concept. The JSON code provides the schema; you just need to replace two actions, Run query and list results and Send Data, as described in the article.

 

Here's also the final Logic Apps playbook for reference:

[Screenshot: playbook.png]

27 Comments
Copper Contributor

Thanks for sharing.

I can't find the Message Tracking role; could you please check and suggest an alternative role?

Copper Contributor

As I did not find the Message Tracking role, I tried the following roles for the group:

View-Only Manage Alerts
Organization Configuration
View-Only Audit Logs
View-Only Record Management
View-Only Recipients

 

I created a playbook, but I'm getting the following 404 error on the HTTP step:

"value": "Resource not found for the segment 'MessageTrace'."

 

could you please help?

Microsoft

Hi @Mahesh0212, thanks for reaching out. You need the Message Tracking role to access message trace; that's also why the HTTP step is failing. What errors are you getting when trying to create the service account? The Message Tracking role is available, as described here - https://docs.microsoft.com/en-us/exchange/understanding-management-roles-exchange-2013-help. Do you have the permissions required to create the service account user?

Copper Contributor

Hi Stefan,

 

Thanks for responding. I have owner rights.

I don't see the Message Tracking role in "https://protection.office.com/permissions"; do I need to assign the role from somewhere else? I tried the command you mentioned as well.

When I run the following command:

$RoleGroup = New-RoleGroup -Name "Message Trace Reporting" -Roles "Message Tracking", "View-Only Audit Logs", "View-Only Configuration", "View-Only Recipients" -Members $UserName

I received this error:

New-RoleGroup : The term 'New-RoleGroup' is not recognized as the name of a cmdlet, function, script file, or operable
program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
At line:1 char:14
+ $RoleGroup = New-RoleGroup -Name "New Message Trace Reporting" -Roles ...
+ ~~~~~~~~~~~~~
+ CategoryInfo : ObjectNotFound: (New-RoleGroup:String) [], CommandNotFoundException
+ FullyQualifiedErrorId : CommandNotFoundException

 

So for the time being, I am using my account instead of the service account.

 

Once the HTTP output is available, I try to parse the JSON; the schema below was generated by uploading the output. But when I run it, it gives the error:

"ValidationFailed. The schema validation failed."

Could you please share the schema you used?

Here is the schema generated by uploading the output from the HTTP body:

{
    "type": "object",
    "properties": {
        "odata.metadata": {
            "type": "string"
        },
        "value": {
            "type": "array",
            "items": {
                "type": "object",
                "properties": {
                    "Organization": {
                        "type": "string"
                    },
                    "MessageId": {
                        "type": "string"
                    },
                    "Received": {
                        "type": "string"
                    },
                    "SenderAddress": {
                        "type": "string"
                    },
                    "RecipientAddress": {
                        "type": "string"
                    },
                    "Subject": {
                        "type": "string"
                    },
                    "Status": {
                        "type": "string"
                    },
                    "ToIP": {},
                    "FromIP": {
                        "type": "string"
                    },
                    "Size": {
                        "type": "integer"
                    },
                    "MessageTraceId": {
                        "type": "string"
                    },
                    "StartDate": {
                        "type": "string"
                    },
                    "EndDate": {
                        "type": "string"
                    },
                    "Index": {
                        "type": "integer"
                    }
                },
                "required": [
                    "Organization",
                    "MessageId",
                    "Received",
                    "SenderAddress",
                    "RecipientAddress",
                    "Subject",
                    "Status",
                    "ToIP",
                    "FromIP",
                    "Size",
                    "MessageTraceId",
                    "StartDate",
                    "EndDate",
                    "Index"
                ]
            }
        }
    }
}

thanks,

mahesh

Microsoft

Hi @Mahesh0212, you need to follow the steps described in Connect to Office 365 with PowerShell, which is referenced in the article. Specifically, you need to import the AD module and the MSOnline module and initiate a session to get access to the Exchange Online PowerShell cmdlets.

 

As for the schema from the HTTP call, can you please try the following one:

{
    "properties": {
        "odata.metadata": {
            "type": "string"
        },
        "value": {
            "items": {
                "properties": {
                    "EndDate": {
                        "type": "string"
                    },
                    "FromIP": {
                        "type": "string"
                    },
                    "Index": {
                        "type": "integer"
                    },
                    "MessageId": {
                        "type": "string"
                    },
                    "MessageTraceId": {
                        "type": "string"
                    },
                    "Organization": {
                        "type": "string"
                    },
                    "Received": {
                        "type": "string"
                    },
                    "RecipientAddress": {
                        "type": "string"
                    },
                    "SenderAddress": {
                        "type": "string"
                    },
                    "Size": {
                        "type": "integer"
                    },
                    "StartDate": {
                        "type": "string"
                    },
                    "Status": {
                        "type": "string"
                    },
                    "Subject": {
                        "type": "string"
                    },
                    "ToIP": {
                        "type": [
                            "string",
                            "null"
                        ]
                    }
                },
                "required": [
                    "Organization",
                    "MessageId"
                ],
                "type": "object"
            },
            "type": "array"
        }
    },
    "type": "object"
}
Copper Contributor

Great article @Stefan Simon. One question: I'm not seeing any schema info under the Parse JSON 2 action. The article seems to indicate that this is auto-generated... "default schema generated from the return value of query function". I should point out that I don't have a custom log created yet, so there is likely no return data coming from action two (Run query and list results) just yet. The article seems to assume that the custom log (EmailEvents_CL) already exists. However, I'm having trouble finding info on how to create a custom log when not using the default method described in the article below (dependent on CSV files that are picked up by the custom log watcher service regularly). How would one create a custom log when using a Data Collector API, for example? Thanks.

 https://docs.microsoft.com/en-us/azure/azure-monitor/platform/data-sources-custom-logs#defining-a-cu...

 

 

Copper Contributor

@Stefan Simon if you can also provide an exploded view of the complete playbook (like the image in the summary section, but expanded), it would really help someone like me who is new to Azure Sentinel playbooks. Thanks :)

Microsoft

Hi @KenSilver, thanks for your interest in the article. I have uploaded the JSON code from the playbook to GitHub, as well as a playbook screenshot - https://github.com/stefans-cyber/sentinel. Apologies for the low screenshot quality; I wasn't able to make it better, but it should be enough to understand the concept. The JSON code provides the schema; you just need to replace two actions, Run query and list results and Send Data, as described in the article.

Copper Contributor

Thanks @Stefan Simon for updating the images and JSON code. I'm able to progress through the HTTP action, but it now errors on the last Parse JSON action. The error is the same as @Mahesh0212's: "ValidationFailed: The schema validation failed." When I click "show raw outputs" the error message is "Invalid type. Expected String but got Null." This error occurs on every value in the schema (i.e. Subject, MessageId, etc.). The Parse JSON action seems to have received the body from the HTTP action; under INPUTS in Parse JSON, I can "click to download" and it brings up the trace data in a new browser window. I'll continue to troubleshoot. Thanks.

Microsoft

Thanks @KenSilver, what schema are you using in Parse_JSON? Can you verify it's the same schema as below, please?

{
    "properties": {
        "odata.metadata": {
            "type": "string"
        },
        "value": {
            "items": {
                "properties": {
                    "EndDate": {
                        "type": "string"
                    },
                    "FromIP": {
                        "type": "string"
                    },
                    "Index": {
                        "type": "integer"
                    },
                    "MessageId": {
                        "type": "string"
                    },
                    "MessageTraceId": {
                        "type": "string"
                    },
                    "Organization": {
                        "type": "string"
                    },
                    "Received": {
                        "type": "string"
                    },
                    "RecipientAddress": {
                        "type": "string"
                    },
                    "SenderAddress": {
                        "type": "string"
                    },
                    "Size": {
                        "type": "integer"
                    },
                    "StartDate": {
                        "type": "string"
                    },
                    "Status": {
                        "type": "string"
                    },
                    "Subject": {
                        "type": "string"
                    },
                    "ToIP": {
                        "type": [
                            "string",
                            "null"
                        ]
                    }
                },
                "required": [
                    "Organization",
                    "MessageId"
                ],
                "type": "object"
            },
            "type": "array"
        }
    },
    "type": "object"
}
Microsoft

Hi @KenSilver, @Mahesh0212, I just tried to redo the playbook and all works well. Please check that you have correctly filled in the HTTP request and that it correctly returns the O365 message trace in JSON format (you can check in the run history of the playbook).

Brass Contributor

Hi @Stefan Simon,

 

Thanks for the template.


Just created the playbook and noticed that in our case the JSON schema didn't work.
It is due to some of the message traces coming with a null value under FromIP.

It's worth adjusting your template so that not just ToIP but also FromIP can accept null values:

"FromIP": {
"type": [
"string",
"null"
]
}

 

Working perfectly now.
Microsoft

Thanks @caiodaruizcorrea that's very helpful

Copper Contributor

So in addition to adding a null return type for FromIP per @caiodaruizcorrea I also needed to add a null return type for Subject. After that, all worked successfully. Thanks everyone!

Copper Contributor

Hi @Stefan Simon, thanks for sharing the schema. I tried it and also tried the correction mentioned by @caiodaruizcorrea, but I'm getting the error "playbook can't be saved as it contains invalid parameters".

 

I tested the same schema on the internet against the data we are getting via HTTP, and it works perfectly.

 

On another note, can we use any other authentication instead of Basic? As this is a playbook, I got a service account (without domain) and a password from my client, but it doesn't work.

 

Thanks,

Mahesh

Microsoft

Hi @Mahesh0212, good to see the issue has been resolved now. As for the account requirements, I'm sharing details directly from the documentation: the account you use to access the reports must have administrative permissions in the Office 365 organization. If the account can view this report in the Office 365 Control Panel, then the account has permissions to retrieve the data from the REST web service. This report requires the user to be assigned the View-Only Recipients role. In the default Office 365 permissions structure, users with the following administrator permissions can access this report: billing administrator, global administrator, password administrator, service administrator, and user management administrator.

Copper Contributor

Hello @Stefan Simon, thank you for this post.
A customer we work with doesn't see the Message Tracking role when trying to create the service account.
[Screenshots: image.png, Outlook-1re1aiww (1).png]

When he tried to add the user with the command, he got this error.


Does the user require any of the following?
Security Admin
Security Reader
View-Only Recipients
Compliance Admin
Data Loss Prevention
Please let me know how to proceed.

Best Regards,
David Shoshany
Microsoft

Hi @DavidSho, it seems your shell is failing on the New-RoleGroup command. Have you properly connected to your Office 365 environment and imported all required cmdlets as described in the article? Could you please check?

@Stefan Simon This is awesome content, thanks for sharing. I recommend following the step-by-step Logic App construction. People may also find this content useful for learning to work with KQL queries: https://docs.microsoft.com/es-es/azure/azure-monitor/log-query/get-started-queries

Brass Contributor

@Mahesh0212 @DavidSho I ran into the same issue as well with not having the New-RoleGroup cmdlet. You will need to connect using the Exchange Online PowerShell module as well.

 

https://docs.microsoft.com/en-us/powershell/exchange/exchange-online/exchange-online-powershell-v2/e...

Copper Contributor

@Stefan Simon @mperrotta , thanks that solved the issue!

Microsoft

Thanks a lot @mperrotta, I have updated the article with this information.

Copper Contributor

@Stefan Simon Can you share the JSON for the sample workbook you created?

Microsoft

Dear readers, I would like to share another approach for ingesting O365 message trace data, using an Azure Function. You can find more details here - https://github.com/OfficeDev/O365-ActivityFeed-AzureFunction/tree/master/Sentinel/msgtrace

Copper Contributor

@Stefan Simon Hi Stefan, many thanks for this great article. I'm having a problem with the HTTP action where it's returning a 400 error - "The query is invalid". My account is OK, as I've tested it with Invoke-WebRequest using the URI from the raw input of this action and it returns 200, so it does appear that there is an issue with the query. I've compared the query to your GitHub code and it's identical; I've even copied your entire code into a new Logic App and still get the same error.

There isn't much to this action and I just can't see what the issue would be; please see below for the raw input and output from the HTTP action:

 

Many Thanks

 

{
    "uri": "https://reports.office365.com/ecp/reportingwebservice/reporting.svc/MessageTrace?$filter=StartDate%2...'",
    "method": "GET",
    "headers": {
        "Accept": "application/json",
        "Content-Type": "application/json"
    },
    "authentication": {
        "username": "Myaccount@mydomain",
        "password": "*sanitized*",
        "type": "Basic"
    }
}

 

 

 

{
    "statusCode": 400,
    "headers": {
        "request-id": "6332f49f-2fdd-4344-a072-ba2e7a8814b8",
        "X-CalculatedBETarget": "CWXP265MB0119.GBRP265.PROD.OUTLOOK.COM",
        "X-BackEndHttpStatus": "400",
        "X-RUM-Validated": "1",
        "X-RWS-Error": "Microsoft.Exchange.Management.ReportingTask.InvalidExpressionException",
        "X-Content-Type-Options": "nosniff",
        "DataServiceVersion": "3.0;",
        "X-RWS-Version": "2013-V1",
        "X-DiagInfo": "CWXP265MB0119",
        "X-BEServer": "CWXP265MB0119",
        "X-UA-Compatible": "IE=10",
        "Strict-Transport-Security": "max-age=31536000; includeSubDomains",
        "X-Proxy-RoutingCorrectness": "1",
        "X-Proxy-BackendServerStatus": "400",
        "X-FEServer": "AM6P192CA0010",
        "Cache-Control": "no-store, no-cache",
        "Date": "Fri, 19 Jun 2020 12:42:18 GMT",
        "Server": "Microsoft-IIS/10.0",
        "X-AspNet-Version": "4.0.30319",
        "X-Powered-By": "ASP.NET",
        "Content-Length": "102",
        "Content-Type": "application/json; odata=minimalmetadata; streaming=true; charset=utf-8"
    },
    "body": {
        "odata.error": {
            "code": "InvalidQueryException",
            "message": {
                "lang": "",
                "value": "The query is invalid."
            }
        }
    }
}

 

Copper Contributor

I get this error when I run the logic app:

 

{
    "odata.error": {
        "code": "UnknownError",
        "message": {
            "lang": "",
            "value": "An error has occurred on the server."
        }
    }
}

 

This happens during the HTTP API call.

Copper Contributor

Is there a reason Microsoft hasn't created a native Office 365 message trace connector for Sentinel? It boggles the mind that they would ask customers to write scripts or functions themselves to bring this data into Sentinel.
