Server 2019 Domain Controllers: lsass.exe terminated unexpectedly with status code -1073741819
Basically my issue matches https://learn.microsoft.com/en-us/answers/questions/612097/windwos-2019-lsass-exe-terminated-unexpectedly-wit?source=docs exactly. We have Server 2019 DCs running on VMware vSphere 7.0 U3c. The non-PDC DCs are randomly rebooting with the event log message below:

EventID: 1074
MachineName: DC19**
Index: 544467
Category: (0)
EntryType: Information
Message: The process wininit.exe has initiated the restart of computer DC19RP on behalf of user for the following reason: No title for this reason could be found
  Reason Code: 0x50006
  Shutdown Type: restart
  Comment: The system process 'C:\Windows\system32\lsass.exe' terminated unexpectedly with status code -1073741819. The system will now shut down and restart.
Source: User32
ReplacementStrings: {wininit.exe, DC19**, No title for this reason could be found, 0x50006...}
InstanceId: 2147484722
TimeGenerated: 4/23/2023 5:07:58 AM
TimeWritten: 4/23/2023 5:07:58 AM
UserName: NT AUTHORITY\SYSTEM

The servers are all patched to the current CU, 2023-04 (KB5025229), so they should all have the most recent KB I've found that addresses lsass.exe crashes (KB5010791) installed. I've also noticed that shortly before the lsass.exe crash there is an event log entry similar to the one below, although each one references a different WMI filter:

EventID: 1065
MachineName: DC19**
Index: 544466
Category: (0)
CategoryNumber: 0
EntryType: Error
Message: The processing of Group Policy failed. Windows could not evaluate the Windows Management Instrumentation (WMI) filter for the Group Policy object cn={***},cn=policies,cn=system,DC=fabrikam,DC=com. This could be caused by RSOP being disabled or the Windows Management Instrumentation (WMI) service being disabled, stopped, or other WMI errors. Make sure the WMI service is started and the startup type is set to automatic. New Group Policy objects or settings will not process until this event has been resolved.
Source: Microsoft-Windows-GroupPolicy
ReplacementStrings: {4, 714, 0, 136750...}
InstanceId: 1065
TimeGenerated: 4/23/2023 5:07:58 AM
TimeWritten: 4/23/2023 5:07:58 AM
UserName: NT AUTHORITY\SYSTEM

Once the server is back up and running after the crash and reboot, WMI appears to be working fine, and I'm not seeing any other errors specifically referencing WMI itself in the period leading up to the crash.

Server 2019 ADFS LDAP Errors After Installing January 2022 Patch KB5009557
As it stands now, it appears that KB5009557 breaks 'something' in the connection between ADFS and AD. When this happens, you are unable to SSO until the ADFS server is rebooted (sometimes it takes several attempts). We started getting errors (I'll paste the error below) after installing KB5009557, and as soon as the error pops up, you will get it continually until a reboot. However, if/when the reboot does fix it, the fix is only temporary: at some point (maybe when the Kerberos ticket needs to be refreshed?) it breaks again. Right now our heavy hitter is our SharePoint relying party, so that is what appears in the error below. On one occasion, ADFS also broke when I rebooted a few domain controllers. We are currently using a gMSA and not a traditional service account. We have validated that other systems can still successfully query the domain via LDAP with a gMSA after installing the January patches; only the ADFS servers are affected. The ADFS servers are still able to retrieve the gMSA password from the domain. Our domain is healthy: no replication errors or other issues, and we do not have any one-way trusts. So far the only thing that has worked for us is to uninstall KB5009557, which of course we don't want to do for security reasons.

What hasn't worked:
- Updating the krbtgt password in the proper sequence.
- Installing OOB patch KB5010791.

I see that KB5009616 was released on 01/25 and it does mention a few Kerberos items, but the only thing related to ADFS is: "Addresses an issue that might occur when you enable https://docs.microsoft.com/windows-server/identity/ad-fs/troubleshooting/ad-fs-tshoot-logging and an invalid parameter is logged. As a result, Event 207 is logged, which indicates that a failure to write to the audit log occurred." That isn't our issue. Does anyone know if the patch from the 25th resolves this? We're going to install it on one of our ADFS servers as a test.
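For triage, the part that matters is the innermost exception in the chain, not the MSIS7012 wrapper. .NET joins nested exceptions with ' ---> ', so the root cause can be pulled out mechanically. A small illustrative Python helper follows; the condensed `event` string is an assumption modeled on the error pasted in this thread, not a verbatim copy:

```python
# Sketch: extract the root-cause exception from an ADFS "federation passive
# request" error. .NET chains nested exceptions with '--->', and stack
# frames begin with ' at ', so the last '--->' segment of the message
# portion is the innermost exception.

def innermost_exception(event_text: str) -> str:
    # Keep only the message portion: stack frames start with ' at '.
    message = event_text.split(" at ")[0]
    # The final '--->' segment is the deepest (root-cause) exception.
    return message.split("--->")[-1].strip()

# Condensed, illustrative version of the event quoted in this post.
event = ("Microsoft.IdentityServer.RequestFailedException: MSIS7012: ... "
         "---> Microsoft.IdentityServer.ClaimsPolicy.Language.PolicyEvaluationException: POLICY0018: ... "
         "---> System.DirectoryServices.Protocols.LdapException: "
         "The supplied credential is invalid. "
         "at System.DirectoryServices.Protocols.LdapConnection.BindHelper(...)")

print(innermost_exception(event))
# → System.DirectoryServices.Protocols.LdapException: The supplied credential is invalid.
```

Here the root cause is an LdapException with error code 49, which is the standard LDAP invalidCredentials result — so it is the bind itself that fails, not directory availability, consistent with the gMSA observations above.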
Below is the error seen when the connection between ADFS and AD breaks:

Encountered error during federation passive request.

Additional Data
Protocol Name: wsfed
Relying Party: urn:sharepoint:prod
Exception details:
Microsoft.IdentityServer.RequestFailedException: MSIS7012: An error occurred while processing the request. Contact your administrator for details. ---> Microsoft.IdentityServer.ClaimsPolicy.Language.PolicyEvaluationException: POLICY0018: Query ';tokenGroups,sAMAccountName,mail,userPrincipalName;{0}' to attribute store 'Active Directory' failed: 'The supplied credential is invalid. Error code: 49 Server response message: '. ---> Microsoft.IdentityServer.ClaimsPolicy.Engine.AttributeStore.Ldap.LdapServerUnavailableException: The supplied credential is invalid. Error code: 49 Server response message: ---> System.DirectoryServices.Protocols.LdapException: The supplied credential is invalid.
    at System.DirectoryServices.Protocols.LdapConnection.BindHelper(NetworkCredential newCredential, Boolean needSetCredential)
    at Microsoft.IdentityServer.GenericLdap.Channel.ConnectionBaseFactory.GenerateConnection()
    at Microsoft.IdentityServer.ClaimsPolicy.Engine.AttributeStore.Ldap.LdapConnectionCache.CacheEntry.CreateConnectionHelper(String server, Boolean isGC, LdapConnectionSettings settings)
    --- End of inner exception stack trace ---
    at Microsoft.IdentityModel.Threading.AsyncResult.End(IAsyncResult result)
    at Microsoft.IdentityModel.Threading.TypedAsyncResult`1.End(IAsyncResult result)
    at Microsoft.IdentityServer.ClaimsPolicy.Language.AttributeLookupIssuanceStatement.OnExecuteQueryComplete(IAsyncResult ar)
    --- End of inner exception stack trace ---
    at Microsoft.IdentityModel.Threading.AsyncResult.End(IAsyncResult result)
    at Microsoft.IdentityModel.Threading.TypedAsyncResult`1.End(IAsyncResult result)
    at Microsoft.IdentityServer.Web.WSTrust.SecurityTokenServiceManager.Issue(RequestSecurityToken request, IList`1& identityClaimSet, List`1 additionalClaims)
    at Microsoft.IdentityServer.Web.Protocols.PassiveProtocolHandler.SubmitRequest(MSISRequestSecurityToken request, IList`1& identityClaimCollection)
    at Microsoft.IdentityServer.Web.Protocols.PassiveProtocolHandler.RequestBearerToken(MSISRequestSecurityToken signInRequest, Uri& replyTo, IList`1& identityClaimCollection)
    at Microsoft.IdentityServer.Web.Protocols.WSFederation.WSFederationProtocolHandler.RequestBearerToken(MSISSignInRequestMessage signInRequest, SecurityTokenElement onBehalfOf, SecurityToken primaryAuthToken, SecurityToken deviceSecurityToken, String desiredTokenType, WrappedHttpListenerContext httpContext, Boolean isKmsiRequested, Boolean isApplicationProxyTokenRequired, MSISSession& session)
    at Microsoft.IdentityServer.Web.Protocols.WSFederation.WSFederationProtocolHandler.BuildSignInResponseCoreWithSerializedToken(MSISSignInRequestMessage wsFederationPassiveRequest, WrappedHttpListenerContext context, SecurityTokenElement signOnTokenElement, Boolean isKmsiRequested, Boolean isApplicationProxyTokenRequired)
    at Microsoft.IdentityServer.Web.Protocols.WSFederation.WSFederationProtocolHandler.BuildSignInResponseCoreWithSecurityToken(WSFederationSignInContext context, SecurityToken securityToken, SecurityToken deviceSecurityToken)
    at Microsoft.IdentityServer.Web.Protocols.WSFederation.WSFederationProtocolHandler.BuildSignInResponse(WSFederationSignInContext federationPassiveContext, SecurityToken securityToken, SecurityToken deviceSecurityToken)
    --- End of inner exception stack trace ---
    at Microsoft.IdentityServer.Web.Protocols.WSFederation.WSFederationProtocolHandler.BuildSignInResponse(WSFederationSignInContext federationPassiveContext, SecurityToken securityToken, SecurityToken deviceSecurityToken)
    at Microsoft.IdentityServer.Web.Protocols.WSFederation.WSFederationProtocolHandler.Process(ProtocolContext context)
    at Microsoft.IdentityServer.Web.PassiveProtocolListener.ProcessProtocolRequest(ProtocolContext protocolContext, PassiveProtocolHandler protocolHandler)
    at Microsoft.IdentityServer.Web.PassiveProtocolListener.OnGetContext(WrappedHttpListenerContext context)

The event then repeats each inner exception (the PolicyEvaluationException, the LdapServerUnavailableException, and the LdapException) individually, with the same messages and stack frames as above.

No IPv4 access to RDG. Only IPv6
Okay, got a really weird one here. I reluctantly admit it's 100% my fault, too. On a Server 2019 domain controller with RD Gateway installed, it's set up so the business owner can remote in from home to a VM running on the DC. I can also remote in via the RDG directly to the server itself, so I can perform my monthly maintenance remotely. A few weeks ago, in the process of figuring out and setting up IPv6, it seems I somehow disabled IPv4 access to the RDG as a way to confirm it was correctly configured for IPv6. I've got RDG and everything working with IPv6 now, but for the life of me I can't remember exactly what I did to disable IPv4 to the gateway. I've googled quite a bit, and nothing I find re-enables IPv4 on the RDG. Heck, I'm not even sure RDG is where the issue is. I need to re-enable IPv4 so the business owner (and a few other managers) can regain remote access. Here's what I've checked:

- IPv4 is enabled on the network adapter.
- IPv4 is enabled on the VM.
- Port 443 is not blocked for IPv6 on the firewall.
- The policies in Gateway Manager: nothing there indicates IPv4 is blocked.
- The RD-CAP network policy in NPS: nothing there indicates IPv4 is blocked.

I'm at a loss and could use some help in recovering from my own stupidity.

Active Directory DFSR headache
We have 23 DCs, all but one of which are 2012 R2. The one-off I upgraded a couple of weeks ago directly from 2012 R2 to 2019. For the past year or two we've had two DCs that weren't doing SYSVOL replication. I thought I had fixed that before I started the process of upgrading them to 2019, but now that I've done one server, it looks like I was incorrect.

So here's what's driving me nuts. Using the Status tab of the Group Policy Management MMC, things are either horribly FUBAR or humming along perfectly, depending (apparently) on the OS of the computer I'm running the MMC from. If I run it from a Windows 10 workstation or the Server 2019 DC, things look bad: 15 servers show replication "in progress", of which 13 show a status of "Inaccessible" under the SYSVOL column, and 2 show a "Contents" issue with a single GPO. If I run the MMC from the 2012 R2 DCs or from a Windows 8.1 VM I spun up on a hunch, all 22 DCs show in perfect sync (both AD and SYSVOL) with the baseline DC.

When I use a file/folder comparison tool on the contents of the SYSVOL folder for each DC, not one of them matches the contents on the PDC. Although there are no "orphaned" files or folders, the modified date doesn't match on a varying number of files and/or folders on each DC (sometimes off by years). The closest is actually the 2019 DC, which only shows mismatches in the contents of 3 GPOs.

The DFSR event logs don't show any regularly occurring errors other than losing replication for a bit between DCs when one goes down for its system state backup. I ran dcdiag /a /c and didn't see any errors aside from the DFS test failing due to the above-mentioned backup-related interruptions, some system event log errors due to a deleted computer account, and one DC with a typo in the secondary DNS entry in its network adapter settings. There are also no errors when I run repadmin /showrepl.
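The manual file-comparison step described above can be sketched in Python. The helper, the tolerance, and the sample paths are hypothetical; in practice each dict would be built by walking the DC's SYSVOL share:

```python
# Sketch: compare SYSVOL file timestamps on each DC against the PDC's copy,
# mirroring the manual comparison described above. Paths and the 2-second
# tolerance are illustrative assumptions, not from the original post.

def diff_sysvol(baseline: dict[str, float], replica: dict[str, float],
                tolerance: float = 2.0) -> dict[str, list[str]]:
    """Return files whose mtime differs from the baseline beyond the
    tolerance, plus files present on only one side ('orphans')."""
    mismatched = [p for p in baseline.keys() & replica.keys()
                  if abs(baseline[p] - replica[p]) > tolerance]
    return {"mismatched": sorted(mismatched),
            "only_baseline": sorted(baseline.keys() - replica.keys()),
            "only_replica": sorted(replica.keys() - baseline.keys())}

# In practice the dicts would be built by walking e.g.
# \\PDC\SYSVOL\fabrikam.com\Policies with os.walk() + os.path.getmtime().
pdc = {"Policies/{GUID1}/gpt.ini": 1_600_000_000.0,
       "Policies/{GUID2}/gpt.ini": 1_650_000_000.0}
dc2 = {"Policies/{GUID1}/gpt.ini": 1_600_000_000.0,
       "Policies/{GUID2}/gpt.ini": 1_400_000_000.0}  # years of drift

report = diff_sysvol(pdc, dc2)
print(report["mismatched"])  # the GPO folders to inspect first
```

Scripting the comparison per DC makes it easy to see whether the mismatches cluster around specific GPOs (suggesting a replication problem with those objects) or are spread across the whole tree.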
I've tried running both non-authoritative and authoritative synchronization using the instructions at https://docs.microsoft.com/en-us/troubleshoot/windows-server/group-policy/force-authoritative-non-authoritative-synchronization, and neither made any difference at all. Any suggestions?

WSUS 10.0.17763.678 fails to download any updates since in-place upgrade. Event 10032, 364
Hi everyone, after an in-place upgrade from Server 2016 LTSC (GUI) to 2019 LTSC (GUI) and running the WSUS post-upgrade wizard, the WSUS server fails to download any updates.

Event 364 (Error): Content file download failed. Reason: Value does not fall within the expected range.
Source File: /d/msdownload/update/software/secu/2015/06/sqlserver2014-kb3070446-x64_aab1ac21337a4f0b20d228c21fc7c9eb68418431.exe
Destination File: E:\WSUSFiles\WsusContent\31\AAB1AC21337A4F0B20D228C21FC7C9EB68418431.exe

Event 10032 (Error): The server is failing to download some updates.

The latest WAM is in place, and everything else, like clients contacting the server and downloading, works fine. The usual optimizations (pool settings, etc.) are in place. I have checked the permissions guidance from Adam J. (by the way, the docs article is still missing in en-us): WSUS Permissions - WsusContent, Registry, and IIS | AJ Tek Corporation

Direct Mode didn't work on ReFS formatted Cluster Shared Volumes
Hi, why is direct access not possible with a ReFS-formatted CSV volume? Instead of direct access, all ReFS-formatted CSVs provided by a SAN run in "FileSystemRedirected" mode. This behavior is critical because "File System Redirected" mode, compared to direct access, can cost 90% or more in performance depending on the environment. Neither the fact that ReFS-formatted CSVs only run in "File System Redirected" mode nor the associated performance penalties are mentioned in any official Microsoft statement. On the contrary, Microsoft actively recommends using ReFS for VHDX files. That is why, like many others, I have formatted CSVs with ReFS since Server 2016, because I hoped it would be an advantage for the customer systems. Unfortunately, it led to the opposite, and the reason was not easy to find, mainly because Microsoft has documented absolutely nothing about this behavior. Over the past few months I personally had to move more than 100 TB of data on various customer systems in order to eliminate this problem and bring the ReFS-formatted CSVs back to NTFS. This cost my company a considerable amount and also massively annoyed the customers. However, if you know what to look for, you can now find plenty of posts on the Internet that confirm this behavior. Here are a few examples.
https://github.com/MicrosoftDocs/windowsserverdocs/issues/2051
https://www.hyper-v-server.de/hypervisor/performance-probleme-hyper-v-cluster-mit-san-storage-und-csvs-mit-refs-formatiert/
https://techcommunity.microsoft.com/t5/failover-clustering/understanding-the-state-of-your-cluster-shared-volumes/ba-p/371889
https://www.windowspro.de/marcel-kueppers/refs-ntfs-vor-nachteile-dateisysteme-server-2016
https://www.wowrack.com/blog/microsofts-latest-system-refs-compared-to-ntfs/
https://4sysops.com/archives/windows-server-2019-cluster-shared-volumes-best-practices/
https://social.technet.microsoft.com/Forums/ie/en-US/6b2dcc4f-e735-4700-81f3-df45d94e7e01/refs-for-a-hyperv-csv-volume?forum=winserverhyperv
https://forums.veeam.com/veeam-backup-replication-f2/latest-veeam-community-forums-digest-oct-2-oct-8-t46019.html

Therefore, I will spare myself further details and come straight to my request. If ReFS fundamentally does not support direct mode, then I expect Microsoft to publicly clarify this accordingly and to clearly state which disadvantages can arise when CSVs are formatted with ReFS. If it is supposed to work and there is simply a bug in between, please finally fix it. This problem has existed since Server 2016, so enough time should have passed to fix it.

Best regards from Germany
Alex
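For anyone checking whether they are affected: the CSV I/O mode is reported by the Get-ClusterSharedVolumeState PowerShell cmdlet on a cluster node. Below is a rough Python sketch that scans that cmdlet's list-style output for redirected volumes; the sample text is illustrative, not captured from a real cluster:

```python
# Sketch: flag CSVs stuck in redirected mode by parsing the output of the
# PowerShell cmdlet Get-ClusterSharedVolumeState (run on a cluster node).
# The sample below is illustrative; field names follow the cmdlet's
# list-style output.

def redirected_csvs(cmdlet_output: str) -> list[str]:
    """Return the VolumeFriendlyName of every CSV whose StateInfo is not
    'Direct' -- e.g. 'FileSystemRedirected', the state this post describes
    for ReFS-formatted CSVs backed by a SAN."""
    volumes, name = [], None
    for line in cmdlet_output.splitlines():
        if ":" not in line:
            continue
        key, _, value = (part.strip() for part in line.partition(":"))
        if key == "VolumeFriendlyName":
            name = value
        elif key == "StateInfo" and name and value != "Direct":
            volumes.append(name)
    return volumes

sample = """
VolumeFriendlyName : CSV-NTFS-01
StateInfo          : Direct

VolumeFriendlyName : CSV-ReFS-01
StateInfo          : FileSystemRedirected
"""
print(redirected_csvs(sample))  # → ['CSV-ReFS-01']
```

Any volume the scan flags is taking the slow path the post complains about; the cmdlet's FileSystemRedirectedIOReason field (not parsed here) indicates why.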