Alert "Monitor Condition" never changes


We're starting our journey from SCOM to Azure Monitor and have run into an issue with Azure Alerts (sorry for posting this in Azure Log Analytics, but there is no Azure Monitor Tech Community).

 

I've noticed that when an Azure Alert is generated, the Monitor Condition never changes from "Fired" to "Resolved". According to the documentation, the Monitor Condition "Indicates whether the condition that created a metric alert has been resolved. Metric alert rules sample a particular metric at regular intervals. If the criteria in the alert rule is met, then a new alert is created with a condition of "fired." When the metric is sampled again, if the criteria is still met, then nothing happens. If the criteria is not met, then the condition of the alert is changed to "resolved." The next time that the criteria is met, another alert is created with a condition of "fired.""

Despite the condition no longer being met (for instance, a service down), the Monitor Condition never changes. Am I missing something?
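For anyone who wants to check this outside the portal, here is a minimal sketch that builds the Alerts Management list-alerts URL and pulls out each alert's monitor condition. The api-version (2019-03-01) and the `properties.essentials.monitorCondition` field path are assumptions from my reading of the REST API, so verify them against your environment:

```python
# Sketch (not an official sample): build the Alerts Management list-alerts
# URL and extract monitorCondition from a response payload.
# Assumed: api-version 2019-03-01 and the
# properties.essentials.monitorCondition field path.

def alerts_url(subscription_id: str, api_version: str = "2019-03-01") -> str:
    """URL that lists all alerts in a subscription."""
    return (
        "https://management.azure.com"
        f"/subscriptions/{subscription_id}"
        "/providers/Microsoft.AlertsManagement/alerts"
        f"?api-version={api_version}"
    )

def monitor_conditions(payload: dict) -> list[tuple[str, str]]:
    """(alert name, monitorCondition) pairs from a list-alerts response."""
    return [
        (a["name"], a["properties"]["essentials"]["monitorCondition"])
        for a in payload.get("value", [])
    ]
```

A GET against that URL with a bearer token should return the alerts; in our case every entry's monitorCondition stays "Fired".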

 

21 Replies

I read Vijay's response about query-based alerts, and I don't understand the logic. I've added another question for him on that Yammer thread.

Thanks for bringing that up.


Did anything ever come of this? I'm seeing this behavior right now with V2 (non-classic) log-search-based alerts. I can't access the Yammer thread; any info would be appreciated.


@Steven Whitney Saw that they're planning to address this in a preview feature sometime this year (as of June). 


@Scott Allison 

 

This is driving me crazy. I am doing everything I can to get a list of Acknowledged alerts so our junior staff can have insight.
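If it helps while waiting for a fix: the Alerts Management API appears to accept an alertState filter, so a list of Acknowledged alerts should be retrievable directly. A sketch, with the api-version and the alertState parameter assumed from my reading of the API:

```python
# Sketch: list-alerts URL filtered to alertState == "Acknowledged".
# Assumed: api-version 2019-03-01 and the alertState query parameter.
from urllib.parse import urlencode

def acknowledged_alerts_url(subscription_id: str,
                            api_version: str = "2019-03-01") -> str:
    """List-alerts URL returning only Acknowledged alerts."""
    base = (
        "https://management.azure.com"
        f"/subscriptions/{subscription_id}"
        "/providers/Microsoft.AlertsManagement/alerts"
    )
    query = urlencode({"api-version": api_version,
                       "alertState": "Acknowledged"})
    return f"{base}?{query}"
```

A GET on that URL (with a bearer token) should return only the alerts someone has acknowledged.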


@Scott Allison 

Hi there

I was out of the office for the last 6 months and don't know what the status is yet. While I'm trying to get an answer, you're also welcome to contact the product manager directly: Yaniv.Lavi@microsoft.com

 

Thanks for bringing this up again


@Noa Kuperberg Welcome back! :) 


Thanks! So far, the answer I have is that alerts ownership is in transition, and we should ping again in a few weeks.


Hi,

Wanted to share that I've contacted the alert management PMs again, and this is a high priority on their list. They want to solve the issue holistically for all alert types, as alert state management has been identified as a source of issues with other alert types as well (not only log-based).

We will keep you updated when a clear timeline is available.


@Noa Kuperberg Do you know whether this same issue is likely affecting Sentinel alerts as well?

I was hoping/expecting incidents resolved there to pass through to MCAS.


@imrichard83 Sentinel alerts are different; they don't use Monitoring alerts, so they should not be affected, AFAIK.


@Noa Kuperberg Any update on this? Alerts are pretty useless if they can only be fired once.

 

Thanks in advance,

Anders


@Scott Allison did you find an answer on how to change the monitor condition?

 

Thanks in advance,

Anders
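One note for anyone else landing here: as far as I can tell, the monitor condition itself is system-controlled, but the user-facing alert state (New / Acknowledged / Closed) can be changed through the changestate operation. A hedged sketch of the POST target, with the api-version and parameter names assumed from my reading of the Alerts Management API:

```python
# Sketch: POST target for the Alerts Management changestate operation.
# Assumed: api-version 2019-03-01 and the newState query parameter.
# This changes the alert *state*, not the system-set monitor condition.

def change_state_url(subscription_id: str, alert_id: str,
                     new_state: str = "Closed",
                     api_version: str = "2019-03-01") -> str:
    """URL to POST to in order to set an alert's state."""
    return (
        "https://management.azure.com"
        f"/subscriptions/{subscription_id}"
        "/providers/Microsoft.AlertsManagement/alerts"
        f"/{alert_id}/changestate"
        f"?api-version={api_version}&newState={new_state}"
    )
```

An authenticated POST to that URL (empty body) should move the alert to the requested state.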


I've reached out to the alerts PM (Yaniv.Lavi@microsoft.com), awaiting his update on this.


Hi Noa. 

I posted a similar question on Stack Overflow, and with a little help I found the cause of my issue. I changed the aggregation from "Count" to "Total", and that resolved the alert.

 

https://stackoverflow.com/questions/59009802/azure-alert-only-fired-once/59014950#59014950


Thanks @andersbaumann !

Great to see you were able to adjust the query and find a solution. Harel's answer is relevant only to metric alerts, and the problem with resolving log-based alerts is still ongoing.

The answer I got from Yaniv Lavi (Yaniv.Lavi@microsoft.com) is that they're hoping to fix it in the next semester, but it's not certain yet.

 
Noa

@Noa Kuperberg Hi,

 

What's the status of this? I am trying to configure alerts for failed automation runbooks and I am seeing the same behavior. The alert fires only once per resource and remains "Fired". I tried both Total and Count aggregations with the same outcome.

 

[Image: clipboard_image_0.png]

 

Best regards,

Alex

 


@alexm186 I'm dealing with the exact same issue as well - alerts are fired once but never again. 


@Alex, @Anthony_W 

I am forwarding this to Yaniv (Yaniv.Lavi@microsoft.com), who owns log-based alerts, for an update.

 

Thanks, I've raised a case and have been told it's a bug, so I'm waiting on a fix to be deployed.

@Anthony_W Hi, I am also facing the same issue in my project. Would you mind sharing more details about the case you raised? I'm interested in things like the timeline and the team that responded :)