Apr 05 2024 04:20 PM
I have an Azure Function app (hosted on the Consumption plan) with a timer trigger that polls AWS SQS for events, reads the corresponding data from an AWS S3 bucket, and ingests it into Log Analytics custom tables.
What I noticed during a performance run is that once the rate climbs above roughly 2,500 events/sec (anything lower works just fine), the function app struggles to keep up, so events accumulate in the S3 bucket.
The function app is simple in its design. From the log-ingestion point of view, the time taken by ingestion-time transformations is almost negligible.
I could move to a Premium plan, but first I'd like to know whether this is the maximum throughput the Consumption plan can achieve.
Any ideas on how to improve this at scale?
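For context, here is a minimal sketch of one common way to raise per-instance throughput in a pipeline like this: fetch the S3 objects concurrently (the downloads are blocking I/O, so a thread pool helps) and then push the events to Log Analytics in fixed-size batches. The `fetch_object` and `ingest` functions below are hypothetical stubs standing in for the real `boto3`/`azure-monitor-ingestion` calls, which need credentials; the structure, not the I/O, is the point.

```python
import concurrent.futures as cf

def fetch_object(key):
    # Stub: in the real app this would be s3.get_object(...) plus parsing.
    # Returns a list of event rows for one S3 object key.
    return [{"TimeGenerated": "2024-04-05T16:20:00Z", "RawData": key}]

def ingest(batch):
    # Stub: in the real app this would be LogsIngestionClient.upload(...).
    # Returns the number of rows sent, for accounting.
    return len(batch)

def drain(keys, max_workers=16, batch_size=500):
    """Download objects concurrently, then ingest events in batches."""
    events = []
    with cf.ThreadPoolExecutor(max_workers=max_workers) as pool:
        # pool.map preserves input order and overlaps the blocking fetches.
        for rows in pool.map(fetch_object, keys):
            events.extend(rows)
    sent = 0
    for i in range(0, len(events), batch_size):
        sent += ingest(events[i:i + batch_size])
    return sent
```

Whether this is enough depends on where the bottleneck actually is (SQS polling, S3 reads, or the ingestion endpoint), but batching and concurrency are usually the first levers to pull before changing hosting plans.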
Apr 07 2024 11:29 PM
Apr 08 2024 08:59 AM
@Kugan Nadaraja Thank you for your response.