SQL Mysteries: Why is my SQL Server experiencing lots of 17830 (TCP 10054) errors?

Published Feb 21 2022 10:07 AM

Moved from: bobsql.com

 

I was reviewing a test run this week that had more SQL 17830 (TCP 10054 / 0x2746) errors than I could explain from just kills or login timeouts.

 

I started looking at the output in the connectivity ring buffer and the matching XEvent, the error_reported event where error_number = 17830. I thought I would find my answer among the login timing information (Reference: https://blogs.msdn.microsoft.com/bobsql/2019/02/10/sql-mysteries-sql-server-login-timeouts-a-debugging-story). Unfortunately, the timings were always small, and it was not something the SQL Server engine was doing; instead, the client was closing the connection. In fact, I was able to break the reproduction down to a single connection against an idle SQL Server, so there was no impact from other workloads.
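If you want to capture the same event, a session along these lines filters error_reported down to 17830 (a sketch; the session name and the chosen actions are arbitrary, pick whatever suits your environment):

```sql
-- Capture only the 17830 errors, with enough actions to identify the client
CREATE EVENT SESSION [track_17830] ON SERVER
ADD EVENT sqlserver.error_reported (
    ACTION (sqlserver.client_app_name, sqlserver.client_hostname)
    WHERE ([error_number] = (17830))
)
ADD TARGET package0.ring_buffer;
GO
ALTER EVENT SESSION [track_17830] ON SERVER STATE = START;
```

Query the ring_buffer target (sys.dm_xe_session_targets) after reproducing the errors to see the client host and application producing them.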

 

Perhaps the problem was a lag on the client or TCP layer?

I was expecting to see the client start the connection by sending the TCP SYN, perhaps taking a long time to reach the server. I thought the network trace might reveal a SYN, a long delay that exceeded the connection timeout, and then a close (RST) from the client. Instead, what I saw was the SYN (TCP open) followed quickly by the RST (TCP close).

 

[Network trace excerpt: client SYN followed immediately by a client RST]

 

  • Many articles explain that a client SYN followed by a server RST usually means the server is not listening on the port. This was not the pattern.
  • Other articles note that a client SYN followed by a client RST can indicate something like a firewall blocking the outgoing traffic. I turned off the firewall and could still reproduce the problem.

What was causing the SQL Server networking client to call the TCP open and then call TCP close, without exceeding the connection timeout and without attempting the TDS login activities?

 

To make the scenario a bit more interesting:

  • The pattern only happened when using SQLClient connections; ODBC connections didn’t exhibit the same behavior
  • The pattern only happened on certain lab systems

With some help from Dylan and Brian, I was able to reduce the reproduction to a PowerShell script using SQLClient to connect to SQL Server.

 

PowerShell Script

while (1 -eq 1)
{
    # Pooling=False forces a new TCP connection on every iteration
    $connectionString = 'Data Source=MyServer,1433;database=master;User ID=sa;Password=xxxxxxxxx;Pooling=False'
    $sqlConnection = New-Object System.Data.SqlClient.SqlConnection $connectionString
    $sqlConnection.Open()
    $sqlConnection.Close()
}

 

Attaching the debugger to the PowerShell repro, I narrowed the issue to a close being called during the open.

 

ws2_32.dll!closesocket
System.Data.dll!TcpConnection::CloseOutstandingSocket
System.Data.dll!Tcp::SocketOpenParallel
System.Data.dll!Tcp::ParallelOpen
System.Data.dll!Tcp::Open
System.Data.dll!SNIOpenSync
System.Data.dll!Connect

What I discovered was the SQLClient attempting to connect to the server using transparent network IP resolution. The servers exhibiting the behavior had multiple IP addresses registered with DNS, and the servers that didn’t exhibit the behavior had a single IP address registration. The SQLClient attempts to connect to all IP addresses returned from getaddrinfo, in parallel, if the target IP address registration count is between 2 and 64. The SQLClient starts asynchronous TCP open (SYN) requests to the listed IP addresses. The first connection to accept the SYN ACK from the server (accept the open request) wins, and the other, parallel, open requests are closed (RST). This explains why I always saw a burst of open requests (SYN) to the server but only one of them succeeded and the others were closed (RST).
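The "first open wins" race can be illustrated with a short sketch. This is not SqlClient's actual implementation (that code is native); it is a minimal Python analogue using two loopback listeners to stand in for a server name that DNS resolves to multiple IP addresses. Note the SO_LINGER trick on the losing socket: a zero linger timeout makes close() emit an RST rather than a FIN, which is exactly the SYN-then-RST pattern seen in the network trace.

```python
import concurrent.futures
import socket
import struct

# Two loopback listeners stand in for one server name registered
# under multiple IP addresses in DNS (hypothetical setup).
listeners = []
for _ in range(2):
    srv = socket.socket()
    srv.bind(("127.0.0.1", 0))       # ephemeral port
    srv.listen(1)
    listeners.append(srv)

def tcp_open(port):
    # One TCP open (SYN) per candidate address.
    return socket.create_connection(("127.0.0.1", port), timeout=5)

with concurrent.futures.ThreadPoolExecutor(max_workers=2) as pool:
    futures = [pool.submit(tcp_open, s.getsockname()[1]) for s in listeners]
    done, rest = concurrent.futures.wait(
        futures, return_when=concurrent.futures.FIRST_COMPLETED)
    winner = done.pop().result()     # first completed handshake wins
    for f in done | rest:            # every other attempt is abandoned
        loser = f.result()
        # SO_LINGER with a zero timeout makes close() send RST, not FIN --
        # the SYN-then-RST pattern visible in the network trace.
        loser.setsockopt(socket.SOL_SOCKET, socket.SO_LINGER,
                         struct.pack("ii", 1, 0))
        loser.close()

winner.close()
for srv in listeners:
    srv.close()
```

Watching this in a packet capture shows one three-way handshake that survives and one that is opened and immediately reset, mirroring the SQLClient behavior.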

The SQLClient provides a connection property to control the transparent network IP resolution behavior.

 

Property = TransparentNetworkIPResolution : https://docs.microsoft.com/en-us/dotnet/api/system.data.sqlclient.sqlconnection.connectionstring?view=netframework-4.8

Setting the property to false in the connection string disables the parallel activity, removing the SYN, RST 17830 (10054) pattern for the secondary connection attempts.
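For example, the repro connection string from the script above with the property disabled (same hypothetical server name and credentials):

```
Data Source=MyServer,1433;database=master;User ID=sa;Password=xxxxxxxxx;Pooling=False;TransparentNetworkIPResolution=False
```

With this setting, SqlClient opens a single TCP connection at a time instead of racing all DNS-registered addresses in parallel.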

 

I am not recommending you run out and disable transparent network IP resolution on your clients. The design where the first connection to open wins is a nice addition to the connection capabilities.

 

I wrote this blog to help you troubleshoot additional 10054 error patterns. Knowing that a back-to-back SYN, RST from the client can come from these parallel connection attempts, you can filter them out (treat them as noise) and focus on other issues that might be causing 10054 errors on your system.
