I've got an event receiver on a library that fires on ItemUpdated. It works fine for the most part, but yesterday I ran a mass-upload script against the library (about 800 items), and the event receiver didn't fire every time; worse, some of the events that did fire performed incorrect updates (it appears variables from different firing events got mixed up... I don't even know how that's possible).
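One common way events "mix up" variables is shared mutable state: if the receiver service stores per-event data on a shared instance (or in static fields) while handling concurrent calls, one event's values overwrite another's. This is only a guess at the cause, and the sketch below is a generic Python illustration of the anti-pattern, not the poster's actual code; all names are hypothetical.

```python
class StatefulReceiver:
    """Anti-pattern: stores per-event data on a shared instance."""

    def __init__(self):
        self.item_id = None          # shared across concurrent calls

    def begin(self, item_id):
        self.item_id = item_id       # overwritten by the next event

    def finish(self):
        return f"updated item {self.item_id}"


receiver = StatefulReceiver()
receiver.begin("A")                  # event for item A starts...
receiver.begin("B")                  # ...item B's event arrives before A finishes
print(receiver.finish())             # both updates now target item B
print(receiver.finish())
```

A mass upload makes this far more likely simply because many events are in flight at once; keeping all per-event data in locals (or a fresh object per call) avoids it.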
My question is: is there a recommended way to code these things so they are sure to fire? Do I need to beef up the power on my Azure instance where the event receiver is hosted?
I depend on these things a lot, and have previously resorted to writing timer jobs to go and audit the lists where the event receivers should have fired, but that doubles my workload any time I want to use them. I'm open to suggestions!
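The audit fallback described above can be boiled down to a reconciliation pass: compare each item's last-modified stamp against a record of what the receiver actually handled, and reprocess anything that was missed. A minimal sketch, with hypothetical data shapes (integer timestamps, a `processed` dict keyed by item id):

```python
def audit(list_items, processed):
    """Return ids whose latest change was never handled by the receiver."""
    missed = []
    for item in list_items:
        last_handled = processed.get(item["id"])
        if last_handled is None or item["modified"] > last_handled:
            missed.append(item["id"])
    return missed


items = [
    {"id": 1, "modified": 10},
    {"id": 2, "modified": 12},
    {"id": 3, "modified": 7},
]
processed = {1: 10, 3: 5}            # item 2 never handled, item 3 handled stale
print(audit(items, processed))       # -> [2, 3]
```

The real work is keeping the `processed` record up to date from inside the receiver, which is why this approach doubles the workload.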
AFAIK, there is nothing you can do to ensure remote event receivers fire correctly. To my knowledge, the other options you have here, neither of which is an absolute guarantee either for scenarios with massive document uploads, are: (1) use webhooks; (2) evaluate Flow / Azure Logic Apps for your scenario.
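For option (1), a webhook is created by POSTing a subscription to the list's `subscriptions` REST endpoint. The sketch below only builds the endpoint URL and request body following the SharePoint webhooks REST API; the site URL, list id, and notification URL are placeholders, and authentication and the actual HTTP call are left out. Note that subscriptions expire (SharePoint caps them at 180 days), so your service must also renew them.

```python
import json
from datetime import datetime, timedelta, timezone

SITE = "https://contoso.sharepoint.com/sites/team"     # placeholder site
LIST_ID = "00000000-0000-0000-0000-000000000000"       # placeholder list GUID

# Subscriptions are created against the list's /subscriptions endpoint.
endpoint = f"{SITE}/_api/web/lists(guid'{LIST_ID}')/subscriptions"

# Expiration is required and capped at 180 days from creation.
expiry = datetime.now(timezone.utc) + timedelta(days=90)

payload = json.dumps({
    "resource": f"{SITE}/_api/web/lists(guid'{LIST_ID}')",
    "notificationUrl": "https://myservice.azurewebsites.net/api/webhook",  # placeholder
    "expirationDateTime": expiry.isoformat(),
    "clientState": "mass-upload-handler",
})
print(endpoint)
print(payload)
```

A design note on why this helps with bulk uploads: a webhook notification only tells you *that* the list changed, not which items; your handler then queries the list's change log to find the changed items. A burst of 800 uploads therefore collapses into a smaller number of notifications that you process at your own pace, instead of 800 individual event calls racing each other.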