
Scenario:

Let’s say your application uses an Azure Storage Account for uploading and downloading blobs. You are seeing latency when uploading or downloading blobs, yet nothing has changed in your application, and you are unable to figure out what is causing the issue. You are unsure whether the latency comes from the application or from the Azure Storage Account.

Now, let’s see how we can isolate the latency issue and determine whether the latency comes from the Storage Account or from the client application.

Before we troubleshoot the latency issue, let us understand what Server Latency and E2E Latency are:

  • End-to-end (E2E) latency: measures the interval from when the Azure Storage server receives the first packet of the request until it receives the client acknowledgment on the last packet of the response. In simpler terms, it is the round trip of an operation: starting at the client application, plus the time taken to process the request at the Storage server, plus the return to the client application.
  • Server latency: measures the interval from when the Azure Storage server receives the last packet of the request until the first packet of the response is returned by the server. In simpler terms, it is the time the Storage server takes to process a given request.
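The relationship between the two metrics is what drives the rest of this troubleshooting: whatever part of the E2E latency is not server latency was spent on the network or at the client. A minimal sketch of that subtraction (the function name is illustrative, not from any SDK):

```python
def client_side_latency_ms(e2e_latency_ms, server_latency_ms):
    """Approximate the portion of latency spent outside the Storage
    service (network transfer plus client processing).

    E2E latency covers the full round trip as seen by the Storage
    front end; server latency covers only request processing, so the
    difference is time spent on the wire and at the client."""
    return max(0.0, float(e2e_latency_ms) - float(server_latency_ms))

# Example: 900 ms end-to-end but only 45 ms of server processing
# points at roughly 855 ms of client/network overhead.
print(client_side_latency_ms(900, 45))  # → 855.0
```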

The first place to check for any latency issue is the metrics available on the Portal for the storage account. Navigate to the storage account -> go to the Metrics tab -> select the metric values depending on which storage service you are using. For example, I am using the Blob service when selecting the metric values:
1.png

 

In usual scenarios, there should be little difference between the Server Latency and the E2E Latency. If you see that the E2E latency is significantly higher than the Server Latency, you need to investigate why there is client-side latency.

If you have Storage Analytics logging enabled on the storage account, you can use these logs to find out whether the latency is server latency or E2E latency. You can learn more about Storage Analytics logging by referring to this link. The analytics logs are saved in the $logs container and can be accessed using the Storage Explorer tool. Please refer to the screenshot below for reference:

2.png

Now, in the Storage Analytics logs, we can see the E2E latency and Server latency values for each operation performed on the storage account. A log sample is shown below:
3.png

 

In the logs, we can see that the E2E latency is high compared to the Server Latency.
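If you have many entries in $logs, pulling the two values out programmatically is easier than eyeballing them. A hedged sketch, assuming log format version 1.0 where the entry is semicolon-delimited and the end-to-end and server latencies are the sixth and seventh fields (the sample line below is synthetic and abbreviated):

```python
def parse_latencies(log_line):
    """Extract (e2e_latency_ms, server_latency_ms) from a Storage
    Analytics log entry.  Assumes log format version 1.0, where the
    entry is semicolon-delimited with end-to-end latency in field 6
    and server latency in field 7 (1-based)."""
    fields = log_line.split(";")
    return int(fields[5]), int(fields[6])

# Synthetic example line, abbreviated to the fields we need:
sample = "1.0;2020-06-01T10:00:00.000Z;GetBlob;Success;200;812;14;authenticated"
e2e, server = parse_latencies(sample)
print(e2e - server)  # → 798
```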

Thus, by using the metrics available on the Portal or the Storage Analytics logs, we can determine whether the latency observed is from the client side or from the Storage server side.

Let us assume that there is high E2E latency and low Server Latency. How do we isolate this issue further, i.e., how do we determine whether the latency is caused by the client application/VM’s performance, by the network, or by both?
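One practical way to attribute the gap is to time each operation at the client and compare that figure with the Server Latency recorded for the same request. A minimal, hypothetical timing wrapper (not part of any Azure SDK) could look like this:

```python
import time

def timed_call(operation, *args, **kwargs):
    """Run any callable (e.g. a blob upload or download) and return
    (result, elapsed_ms) as observed from the client side."""
    start = time.perf_counter()
    result = operation(*args, **kwargs)
    elapsed_ms = (time.perf_counter() - start) * 1000.0
    return result, elapsed_ms

# Usage sketch: wrap the SDK call you already make, e.g.
#   _, ms = timed_call(blob_client.upload_blob, data, overwrite=True)
# and compare `ms` against the Server Latency reported for the same
# request in the metrics or $logs.
```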

The common attributes to check for a performance issue with the client application/VM are:

  1. High CPU on the client: When you are running the application, check whether there is a spike in the CPU usage of the client machine. If CPU usage is higher than expected, it will cause a performance issue for the application and in turn increase the latency.
  2. Low available memory on the client: Check if you are running low on memory on the client machine.
  3. Running out of network bandwidth on the client.
  4. Misconfigured client application.

If you observe any of the above while running your application, consider increasing your VM’s size. Scale up your VM and check whether the latency persists while uploading/downloading blobs to the storage account.

If you find performance issues with the application, you can refer to the performance checklist mentioned in the link. Note that the checklist is for .NET applications.

You can also face latency caused by the distance between the Azure datacenter (DC) region and your application's location. To isolate Azure DC latency, you can follow the steps below:

  1. Test using the Azure Speed tool given in the link: http://www.azurespeed.com/Azure/Latency
    Select the Azure datacenter that you want to access, and the tool will show the latency to it. This helps isolate latency caused by the distance between the Azure DC region and your application's location.
  2. Test using different container/blobs:
    a. Create a new container (with a different naming convention from the existing containers) in the same storage account, add a new blob to it, and test accessing that blob. This helps eliminate any storage-level partitioning issues.
    b. Create a new blob in your storage account and attempt to access that blob instead of your existing blob in that storage account.
  3. Test using the Azure Storage team’s reference samples given in the link here.
  4. Create a hosted service in the datacenter where the storage account is located, RDP onto the VM, and test throughput from there. Test simply by using IE to browse to the blob, as well as with the tools from above. You will most likely see very fast downloads, which indicates that the problem is the client’s network connection or application, not Azure.
  5. Make sure that you are aware of the performance targets and the throughput offered by the storage account. For this, you can refer to the link here.
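If you prefer to script the region check rather than use the website, a rough stand-in for the round-trip to a datacenter is the TCP connect time to the storage endpoint. A sketch using only the Python standard library (the account name in the comment is a placeholder):

```python
import socket
import time

def tcp_connect_latency_ms(host, port=443, timeout=5.0):
    """Time a single TCP handshake to host:port, a rough proxy for
    the network round-trip between this machine and the endpoint."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        pass  # the handshake itself is what we are timing
    return (time.perf_counter() - start) * 1000.0

# Example (placeholder account name; substitute your own):
# print(tcp_connect_latency_ms("mystorageaccount.blob.core.windows.net"))
```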

 

You can also investigate the issue from the network side. A good VM configuration is not enough if the network bandwidth is low: operations performed against the storage account over a low-bandwidth network will show latency. Thus, we need to check whether low network bandwidth is causing the high E2E latency.

For this, you can use tools such as PSPing or TCPing, or simply take a traceroute from your client machine to the storage account. Additionally, you can use the AzCopy command to see the throughput you get while uploading/downloading blobs to/from the Storage Account.

You can refer to the below steps:

  1. PSPing: A small utility that implements ping functionality. It can be used to measure latency and bandwidth. You can execute the command as below:
  • psping -n 300 broderwus.blob.core.windows.net:443 (if using SSL endpoint)
  • psping -n 300 broderwus.blob.core.windows.net:80 (if using non-SSL endpoint) 
    4.png

 

  2. TCPing: A console application that implements “ping” functionality over a TCP port. This tool can also be used to measure bandwidth or latency issues. The command is: tcping.exe <yourstorageaccountname>.blob.core.windows.net
    5.png

  3. Tracert: You can get the trace route from your client machine to the storage account. You can execute the command below to get the details:
    tracert <yourstorageaccountname>.blob.core.windows.net

  4. AzCopy: AzCopy is a command-line tool that helps transfer data to the storage account. It can also be used to see the throughput that you are getting on the given network. The command to upload data to the storage account is as below:
  • azcopy copy "<your local file path>" "https://<yourstorageaccountname>.blob.core.windows.net/test?<SAS_token>" --recursive=true 

6.png

  5. Storage Explorer: A tool that can be used to perform activities on the storage account such as uploading blobs, downloading blobs, or assigning ACL permissions. The tool internally calls AzCopy.
    7.png
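To interpret what these transfer tools report (or what your own timed transfer produces), throughput is simply bytes moved over elapsed time. A small illustrative helper, not taken from any tool:

```python
def throughput_mb_per_s(bytes_transferred, elapsed_seconds):
    """Convert a transfer size and duration into megabytes per
    second, for comparing runs against expected network bandwidth."""
    if elapsed_seconds <= 0:
        raise ValueError("elapsed_seconds must be positive")
    return (bytes_transferred / (1024 * 1024)) / elapsed_seconds

# Example: a 512 MiB blob uploaded in 64 seconds → 8 MB/s.
print(throughput_mb_per_s(512 * 1024 * 1024, 64))  # → 8.0
```

If the measured figure is far below what your link should sustain, the bottleneck is likely the network path rather than the Storage service.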

If you observe an issue with the network, consider moving your application closer to the region where the Storage Account is hosted. Let’s say your application is hosted on an Azure VM in Central US and the storage account it accesses is in East US. For better network performance, consider placing your application in the same region as the storage account to avoid the network latency.

Hope this article helps you isolate the latency issue and take the appropriate steps to mitigate it.