Error installing ATP Sensor on 2019 DC

Copper Contributor

I tried to install the ATP sensor on a Windows Server 2019 DC and I'm running into the following issue:

 

2022-02-03 04:12:40.8978 Warn JsonSerializerSettingsExtension+JsonSerializationBinder UpdateCurrentDomainAssemblyTypes GetSerializableMembers failed [AssemblyQualifiedName=Microsoft.Tri.Common.RemediationActionData, Microsoft.Tri.Common, Version=2.171.14934.11377, Culture=neutral, PublicKeyToken=null AssemblyQualifiedName=Microsoft.Tri.Common.RemediationActionData, Microsoft.Tri.Common, Version=2.171.14934.11377, Culture=neutral, PublicKeyToken=null exception.Message=Could not load file or assembly 'Microsoft.Azure.Security.Detection.AlertContracts, Version=5.3.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35' or one of its dependencies. The system cannot find the file specified.]
2022-02-03 04:12:41.5541 Info Program Main Deployer started [arguments=VGmKamZFKTHijyHvdXBC+Q==]
2022-02-03 04:12:41.6322 Debug PcapLibraryHelper IsNpcapRunning npcap service exist
2022-02-03 04:12:41.6322 Debug InstallActionGroup Apply started
2022-02-03 04:12:41.6322 Debug CreateCertificateAction Apply started [suppressFailure=False]
2022-02-03 04:12:42.1322 Debug CreateCertificateAction Apply finished
2022-02-03 04:12:42.1322 Debug CreateSensorAction Apply started [suppressFailure=False]
2022-02-03 04:12:42.2885 Warn JsonSerializerSettingsExtension+JsonSerializationBinder UpdateCurrentDomainAssemblyTypes GetExportedTypes failed [assembly=Microsoft.AspNetCore.Http.Abstractions, Version=2.2.0.0, Culture=neutral, PublicKeyToken=adb9793829ddae60 exception.Message=Could not load file or assembly 'Microsoft.AspNetCore.Http.Features, Version=2.2.0.0, Culture=neutral, PublicKeyToken=adb9793829ddae60' or one of its dependencies. The system cannot find the file specified.]
2022-02-03 04:12:42.3353 Warn JsonSerializerSettingsExtension+JsonSerializationBinder UpdateCurrentDomainAssemblyTypes GetSerializableMembers failed [AssemblyQualifiedName=Microsoft.Tri.Common.RemediationActionData, Microsoft.Tri.Common, Version=2.171.14934.11377, Culture=neutral, PublicKeyToken=null AssemblyQualifiedName=Microsoft.Tri.Common.RemediationActionData, Microsoft.Tri.Common, Version=2.171.14934.11377, Culture=neutral, PublicKeyToken=null exception.Message=Could not load file or assembly 'Microsoft.Azure.Security.Detection.AlertContracts, Version=5.3.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35' or one of its dependencies. The system cannot find the file specified.]
2022-02-03 04:12:42.8354 Warn JsonSerializerSettingsExtension+JsonSerializationBinder UpdateCurrentDomainAssemblyTypes GetExportedTypes failed [assembly=Microsoft.AspNetCore.Http.Abstractions, Version=2.2.0.0, Culture=neutral, PublicKeyToken=adb9793829ddae60 exception.Message=Could not load file or assembly 'Microsoft.AspNetCore.Http.Features, Version=2.2.0.0, Culture=neutral, PublicKeyToken=adb9793829ddae60' or one of its dependencies. The system cannot find the file specified.]
2022-02-03 04:12:42.8822 Warn JsonSerializerSettingsExtension+JsonSerializationBinder UpdateCurrentDomainAssemblyTypes GetSerializableMembers failed [AssemblyQualifiedName=Microsoft.Tri.Common.RemediationActionData, Microsoft.Tri.Common, Version=2.171.14934.11377, Culture=neutral, PublicKeyToken=null AssemblyQualifiedName=Microsoft.Tri.Common.RemediationActionData, Microsoft.Tri.Common, Version=2.171.14934.11377, Culture=neutral, PublicKeyToken=null exception.Message=Could not load file or assembly 'Microsoft.Azure.Security.Detection.AlertContracts, Version=5.3.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35' or one of its dependencies. The system cannot find the file specified.]
2022-02-03 04:12:43.1406 Debug CreateSensorAction Apply finished
2022-02-03 04:12:43.1406 Debug SaveSensorMandatoryConfigurationAction Apply started [suppressFailure=False]
2022-02-03 04:12:43.1719 Debug SaveSensorMandatoryConfigurationAction Apply finished
2022-02-03 04:12:43.1719 Debug CreateServicesActionGroup Apply started
2022-02-03 04:12:43.1719 Debug CreateServiceAction Apply started [suppressFailure=False]
2022-02-03 04:12:43.1875 Debug CreateServiceAction Apply finished
2022-02-03 04:12:43.1875 Debug SetServiceDescriptionAction Apply started [suppressFailure=False]
2022-02-03 04:12:43.2031 Debug SetServiceDescriptionAction Apply finished
2022-02-03 04:12:43.2031 Debug ConfigureServiceAction Apply started [suppressFailure=False]
2022-02-03 04:12:43.2188 Debug ConfigureServiceAction Apply finished
2022-02-03 04:12:43.2188 Debug SetServicePreshutdownTimeoutAction Apply started [suppressFailure=False]
2022-02-03 04:12:43.2344 Debug SetServicePreshutdownTimeoutAction Apply finished
2022-02-03 04:12:43.2344 Debug CreateServiceAction Apply started [suppressFailure=False]
2022-02-03 04:12:43.2344 Debug CreateServiceAction Apply finished
2022-02-03 04:12:43.2344 Debug SetServiceDescriptionAction Apply started [suppressFailure=False]
2022-02-03 04:12:43.2344 Debug SetServiceDescriptionAction Apply finished
2022-02-03 04:12:43.2344 Debug ConfigureServiceAction Apply started [suppressFailure=False]
2022-02-03 04:12:43.2500 Debug ConfigureServiceAction Apply finished
2022-02-03 04:12:43.2500 Debug SetServicePreshutdownTimeoutAction Apply started [suppressFailure=False]
2022-02-03 04:12:43.2656 Debug SetServicePreshutdownTimeoutAction Apply finished
2022-02-03 04:12:43.2656 Debug CreateServicesActionGroup Apply finished
2022-02-03 04:12:43.2656 Debug ConfigureVirtualServiceAccountAction Apply started [suppressFailure=False]
2022-02-03 04:12:43.3281 Debug InstallActionGroup Revert started
2022-02-03 04:12:43.3281 Warn InstallActionGroup Revert reverting [rollbackAction=CreateServicesActionGroup index=0 count=4]
2022-02-03 04:12:43.3281 Debug CreateServicesActionGroup Revert started
2022-02-03 04:12:43.3281 Warn CreateServicesActionGroup Revert reverting [rollbackAction=SetServicePreshutdownTimeoutAction index=0 count=8]
2022-02-03 04:12:43.3281 Debug SetServicePreshutdownTimeoutAction Revert started
2022-02-03 04:12:43.3281 Debug SetServicePreshutdownTimeoutAction Revert finished
2022-02-03 04:12:43.3281 Warn CreateServicesActionGroup Revert reverting [rollbackAction=ConfigureServiceAction index=1 count=8]
2022-02-03 04:12:43.3281 Debug ConfigureServiceAction Revert started
2022-02-03 04:12:43.3281 Debug ConfigureServiceAction Revert finished
2022-02-03 04:12:43.3281 Warn CreateServicesActionGroup Revert reverting [rollbackAction=SetServiceDescriptionAction index=2 count=8]
2022-02-03 04:12:43.3281 Debug SetServiceDescriptionAction Revert started
2022-02-03 04:12:43.3281 Debug SetServiceDescriptionAction Revert finished
2022-02-03 04:12:43.3281 Warn CreateServicesActionGroup Revert reverting [rollbackAction=CreateServiceAction index=3 count=8]
2022-02-03 04:12:43.3281 Debug CreateServiceAction Revert started
2022-02-03 04:12:43.3750 Debug ServiceControllerExtension DeleteService succeeded [name=AATPSensor]
2022-02-03 04:12:43.3750 Debug CreateServiceAction Revert finished
2022-02-03 04:12:43.3750 Warn CreateServicesActionGroup Revert reverting [rollbackAction=SetServicePreshutdownTimeoutAction index=4 count=8]
2022-02-03 04:12:43.3750 Debug SetServicePreshutdownTimeoutAction Revert started
2022-02-03 04:12:43.3750 Debug SetServicePreshutdownTimeoutAction Revert finished
2022-02-03 04:12:43.3750 Warn CreateServicesActionGroup Revert reverting [rollbackAction=ConfigureServiceAction index=5 count=8]
2022-02-03 04:12:43.3750 Debug ConfigureServiceAction Revert started
2022-02-03 04:12:43.3750 Debug ConfigureServiceAction Revert finished
2022-02-03 04:12:43.3750 Warn CreateServicesActionGroup Revert reverting [rollbackAction=SetServiceDescriptionAction index=6 count=8]
2022-02-03 04:12:43.3750 Debug SetServiceDescriptionAction Revert started
2022-02-03 04:12:43.3750 Debug SetServiceDescriptionAction Revert finished
2022-02-03 04:12:43.3750 Warn CreateServicesActionGroup Revert reverting [rollbackAction=CreateServiceAction index=7 count=8]
2022-02-03 04:12:43.3750 Debug CreateServiceAction Revert started
2022-02-03 04:12:43.4063 Debug ServiceControllerExtension DeleteService succeeded [name=AATPSensorUpdater]
2022-02-03 04:12:43.4063 Debug CreateServiceAction Revert finished
2022-02-03 04:12:43.4063 Debug CreateServicesActionGroup Revert finished
2022-02-03 04:12:43.4063 Warn InstallActionGroup Revert reverting [rollbackAction=SaveSensorMandatoryConfigurationAction index=1 count=4]
2022-02-03 04:12:43.4063 Debug SaveSensorMandatoryConfigurationAction Revert started
2022-02-03 04:12:43.4063 Debug SaveSensorMandatoryConfigurationAction Revert finished
2022-02-03 04:12:43.4063 Warn InstallActionGroup Revert reverting [rollbackAction=CreateSensorAction index=2 count=4]
2022-02-03 04:12:43.4063 Debug CreateSensorAction Revert started
2022-02-03 04:12:43.4063 Warn JsonSerializerSettingsExtension+JsonSerializationBinder UpdateCurrentDomainAssemblyTypes GetExportedTypes failed [assembly=Microsoft.AspNetCore.Http.Abstractions, Version=2.2.0.0, Culture=neutral, PublicKeyToken=adb9793829ddae60 exception.Message=Could not load file or assembly 'Microsoft.AspNetCore.Http.Features, Version=2.2.0.0, Culture=neutral, PublicKeyToken=adb9793829ddae60' or one of its dependencies. The system cannot find the file specified.]
2022-02-03 04:12:43.4219 Warn JsonSerializerSettingsExtension+JsonSerializationBinder UpdateCurrentDomainAssemblyTypes GetSerializableMembers failed [AssemblyQualifiedName=Microsoft.Tri.Common.RemediationActionData, Microsoft.Tri.Common, Version=2.171.14934.11377, Culture=neutral, PublicKeyToken=null AssemblyQualifiedName=Microsoft.Tri.Common.RemediationActionData, Microsoft.Tri.Common, Version=2.171.14934.11377, Culture=neutral, PublicKeyToken=null exception.Message=Could not load file or assembly 'Microsoft.Azure.Security.Detection.AlertContracts, Version=5.3.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35' or one of its dependencies. The system cannot find the file specified.]
2022-02-03 04:12:43.9376 Debug CreateSensorAction Revert finished
2022-02-03 04:12:43.9376 Warn InstallActionGroup Revert reverting [rollbackAction=CreateCertificateAction index=3 count=4]
2022-02-03 04:12:43.9376 Debug CreateCertificateAction Revert started
2022-02-03 04:12:43.9376 Debug CreateCertificateAction Revert finished
2022-02-03 04:12:43.9376 Debug InstallActionGroup Revert finished
2022-02-03 04:12:44.0626 Error EventLogException Deployer failed [arguments=VGmKamZFKTHijyHvdXBC+Q==]
System.UnauthorizedAccessException: Attempted to perform an unauthorized operation.
at void System.Diagnostics.Eventing.Reader.EventLogException.Throw(int errorCode)
at void System.Diagnostics.Eventing.Reader.NativeWrapper.EvtSaveChannelConfig(EventLogHandle channelConfig, int flags)
at bool Microsoft.Tri.Sensor.Deployment.Deployer.ConfigureVirtualServiceAccountAction.ApplyInternal()
at void Microsoft.Tri.Sensor.Common.DeploymentAction.Apply(bool suppressFailure)
at void Microsoft.Tri.Sensor.Common.DeploymentActionGroup.Apply(bool suppressFailure)
at int Microsoft.Tri.Sensor.Deployment.Deployer.Program.Main(string[] commandLineArguments)

13 Replies

@Mlourh 

We've seen this a couple of times now. It occurs only on Windows Server 2019 with KB5009557 installed.
We'll release an update to the sensor installation, probably next week, to address this issue as well.

@Martin_Schvartzman Please keep us in the loop. I'm encountering the same issue here and really need this resolved to continue with our DC deployment. We're using Server 2019 as well...

And KB5009557 is present:

PS C:\windows\system32> Get-WUHistory | ? { $_.title -match 'kb5009557' }

ComputerName Operationname  Result     Date                Title
------------ -------------  ------     ----                -----
<DCNAME>     Installation   Succeeded  1/21/2022 12:31:... 2022-01 Cumulative Update for Windows Server 2019 (1809) for x64-based Systems (KB5009557)
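 
For comparison, the same check with only built-in cmdlets (no PSWindowsUpdate module required) would look something like this:

# Built-in check - returns the hotfix entry if KB5009557 is installed (errors if it isn't)
Get-HotFix -Id KB5009557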

 

 

@Martin_Schvartzman same issue here with KB5009616 installed on my 2019 DC in Azure.

I found a solution to this issue by running the installer with SYSTEM privileges. I did it with a scheduled task on both of my failing servers successfully.
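 
For anyone who wants to reproduce that workaround, here's a rough sketch of the scheduled-task approach. The installer path and access key below are placeholders for your own values, and the /quiet and AccessKey= arguments follow the documented silent-install syntax, so double-check them against the current MDI docs before relying on this:

# Placeholders - adjust the extracted installer path and your workspace access key
$installer = 'C:\Temp\Azure ATP Sensor Setup.exe'
$accessKey = '<your-access-key>'

# Register a one-shot task that runs the installer as SYSTEM with highest privileges
$action    = New-ScheduledTaskAction -Execute $installer -Argument "/quiet AccessKey=`"$accessKey`""
$principal = New-ScheduledTaskPrincipal -UserId 'SYSTEM' -LogonType ServiceAccount -RunLevel Highest
Register-ScheduledTask -TaskName 'InstallMDISensor' -Action $action -Principal $principal

# Launch it, wait for the installation to finish, then clean up
Start-ScheduledTask -TaskName 'InstallMDISensor'
# ...once the install has completed:
Unregister-ScheduledTask -TaskName 'InstallMDISensor' -Confirm:$false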

@Carl_Chabot @Mlourh @MaximeRastello @ZantenB 
We already added this to our documentation:
https://docs.microsoft.com/en-us/defender-for-identity/troubleshooting-known-issues#problem-installi...


The updated installer should be released sometime next week, or the one after that.

@Carl_Chabot Thank you! That solved our issue with the installation on 2019 domain controllers in Google Cloud.

@ChordRagingMoon @Carl_Chabot @MaximeRastello @ZantenB 

 

https://docs.microsoft.com/en-us/defender-for-identity/whats-new#defender-for-identity-release-2173

We've released the updated installation package. Please note it may take a couple of days to reach your sensor download page.

@Martin_Schvartzman Has anything changed yet again?

We AGAIN have problems installing the agent on new domain controllers.

The network connection has been tested and is OK.

I must say I'm getting a little tired of repeatedly troubleshooting the installation of this specific agent while so many other agents install without problems.

Why doesn't the agent just install initially? It can start complaining afterwards, but at least we would have something to work with.
The current error in the setup log file is:

[2964:292C][2022-07-18T10:13:23]i000: 2022-07-18 08:13:23.7360 Debug DeploymentModel .ctor [IsAfterRestartAndConfigured=False]
[2964:26C8][2022-07-18T10:13:24]i000: 2022-07-18 08:13:24.5014 Error DeploymentModel ValidateCreateSensorAsync Microsoft.Tri.Infrastructure.ExtendedException: Sanitized exception: [Type=System.Net.Http.HttpRequestExceptionMessage=kZbHZ02cunBcHiKyFrnbkg==StackTrace= at async Task<HttpResponseMessage> System.Net.Http.HttpClient.FinishSendAsyncBuffered(Task<HttpResponseMessage> sendTask, HttpRequestMessage request, CancellationTokenSource cts, bool disposeCts)
at async Task<TResponse> Microsoft.Tri.Common.CommunicationWebClient.SendAsync<TResponse>(byte[] requestBytes, int offset, int count)
at async Task<TResponse> Microsoft.Tri.Common.CommunicationWebClient.SendWithRetryAsync<TResponse>(byte[] requestBytes, int offset, int count)InnerException=Microsoft.Tri.Infrastructure.ExtendedException: Sanitized exception: [Type=System.Net.WebExceptionMessage=YWn4O7TiLMlSvbtZSOMfpg==StackTrace= at Stream System.Net.HttpWebRequest.EndGetRequestStream(IAsyncResult asyncResult, out TransportContext context)
at void System.Net.Http.HttpClientHandler.GetRequestStreamCallback(IAsyncResult ar)InnerException=Microsoft.Tri.Infrastructure.ExtendedException: Sanitized exception: [Type=System.IO.IOExceptionMessage=nBkgxS0EDE8CUEg8Ec4cXw==StackTrace= at void System.Net.TlsStream.EndWrite(IAsyncResult asyncResult)
at void System.Net.PooledStream.EndWrite(IAsyncResult asyncResult)
at void System.Net.ConnectStream.WriteHeadersCallback(IAsyncResult ar)InnerException=Microsoft.Tri.Infrastructure.ExtendedException: Sanitized exception: [Type=System.Net.Sockets.SocketExceptionMessage=kPgB8WP+JwtA6gCzvetX8A==StackTrace= at int System.Net.Sockets.Socket.EndReceive(IAsyncResult asyncResult)
at int System.Net.Sockets.NetworkStream.EndRead(IAsyncResult asyncResult)InnerException=]]]]
at Microsoft.Tri.Common.CommunicationWebClient.<SendWithRetryAsync>d__9`1.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at Microsoft.Tri.Common.CommunicationWebClient.<SendAsync>d__7.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at Microsoft.Tri.Sensor.Common.WorkspaceApplicationSensorApiDeploymentProxy.<SendAsync>d__5.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at Microsoft.Tri.Sensor.Deployment.Bundle.UI.DeploymentModel.<ValidateCreateSensorAsync>d__52.MoveNext() failed connecting to service. The issue can be caused by a transparent proxy configuration [WorkspaceApplicationSensorApiEndpoint=Unspecified/***REMOVED***sensorapi.atp.azure.com:443]

 

Note: the port 443 connection to *sensorapi.atp.azure.com is working.
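 
For reference, the connectivity check was essentially a Test-NetConnection against the workspace endpoint; the workspace name below is a placeholder for your own:

# '<workspacename>' is a placeholder for your MDI workspace prefix
Test-NetConnection -ComputerName '<workspacename>sensorapi.atp.azure.com' -Port 443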

@ZantenB 

The sensor installation has two phases of connecting to the backend.
The first is authenticating with the supplied access key in order to register the sensor machine.
The second happens after the sensor has been registered and issued a certificate: we call
"ValidateCreateSensorAsync", which is the first time we try to authenticate with the certificate the sensor uses during runtime, via mutual authentication.
According to the stack, this is where you fail, so most likely it's not a matter of connecting to the endpoint; more likely you have SSL inspection interfering with the mutual authentication, or missing root certificates that prevent the certs from being validated correctly.
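 
If you want a quick way to check for interception, here's a rough sketch (not an official tool) that pulls the certificate actually presented on port 443; the hostname is a placeholder for your workspace's sensor API endpoint. If the issuer turns out to be your proxy or firewall vendor rather than the expected public CA, SSL inspection is rewriting the traffic:

# '<workspacename>' is a placeholder - use your own sensor API endpoint
$endpoint = '<workspacename>sensorapi.atp.azure.com'

$tcp = New-Object System.Net.Sockets.TcpClient($endpoint, 443)
try {
    # Accept any certificate here - we only want to inspect what is presented
    $ssl = New-Object System.Net.Security.SslStream($tcp.GetStream(), $false,
        { param($sender, $cert, $chain, $errors) $true })
    $ssl.AuthenticateAsClient($endpoint)
    $cert = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2($ssl.RemoteCertificate)
    $cert | Format-List Subject, Issuer, NotAfter, Thumbprint
}
finally {
    $tcp.Close()
}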

In theory, we could "force install" the sensor and it would fail with a similar error right after that. How would that give you "something to work with" that we don't get here?

As to why other agents do not fail, I can't comment without knowing their implementation. My guess is that they are not using mutual authentication, but it's just a guess.

@Eli Ofek Thanks for your reply.

I'm also asking our network team whether HTTPS inspection is taking place on this communication; that is most likely the cause.

For me personally it would help if the agent could at least be installed without further errors so we can troubleshoot the network later. For now our automation tends to fail if one of the agents doesn't install, as in this case, causing the rest of the automation to be aborted.

 

Regards, Ben

Hi all, this has been solved for us by the firewall team.
Although they say there is NO SSL inspection on that communication, AND we had a successful PowerShell Test-NetConnection to port 443 on the endpoints, they changed 'some other rule' and suddenly it worked for us.
Thanks, Ben

@ZantenB - I've been working with my firewall team for days now on this exact issue. The initial thought was that CRL validation of the certificate was failing as a result of the OCSP connection over port 80. We opened everything over TCP ports 443 and 80 to the internet to see if that would work... still no dice. Same error.
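 
For anyone wanting to rule CRL/OCSP retrieval in or out themselves, one rough option is to export the certificate presented by the endpoint (for example with the SslStream snippet earlier in this thread) and let certutil fetch and verify the chain; sensorapi.cer below is just a placeholder filename:

# Fetches the AIA/CRL/OCSP URLs for the exported certificate and verifies the full chain
certutil.exe -urlfetch -verify .\sensorapi.cer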

 

Our issue only seems to occur in UK South and Japan East. It seems the ATP service uses different root certificates depending on where you're located.

 

Anyone have more information on this? 

 


@khetheri-admin 

MDI sensors do not use port 80 for any outbound connectivity to the internet. Only 443.

We use the same certificates worldwide, and we're also not yet deployed in Japan, so I'm not sure what you're referring to.

It would be best if you open a support ticket through the portal.