Nov 02 2019 02:26 AM
Hello, I have developed a continuously running script/app that integrates two systems, pulling data out of one and posting it into another. It is currently written in PowerShell and has no UI; it just needs to run continuously and restart if it stops.
I now want to move it off my localhost and host it in Azure somewhere. I am looking for a serverless-type hosting environment and am asking here for advice. I understand this could be an Azure Function, Azure Automation, Containers, or possibly even an Azure Logic App; I'm just unsure what the best approach is. I am also happy to port the code to a more suitable language like nodejs if need be.
Nov 03 2019 12:36 PM
Hello @Andrew Huddleston,
As it is written in PowerShell, I would first take a look at PowerShell Azure Functions. The first step is to get your head around the structure and gain some familiarity with the platform, so I suggest running through Create your first PowerShell function before diving in with yours.
In my opinion, continuous execution is not necessarily a good fit for Functions, though, as I find they work best with a trigger of some sort (even a timer): get triggered, do some work, and then complete. So a rethink of your integration pattern might be a good idea. "Continuous" implies a constant, steady stream of work, which sounds like it might fit better with Data Factory.
The good news is you have something working now so you can use it while developing a cloud-based solution.
Nov 03 2019 12:49 PM
Hello @Jeffrey Chilberto
Thank you for this reply, this is very helpful. A little more context around the app: it is used to stream authentication logs from our IDP into Elasticsearch (the equivalent of streaming the Azure Sign-in Logs from the Azure APIs to Elasticsearch). Each page of data it collects has a "next url" link to retrieve the next page of results. Because I am not using specific dates, and am simply following the next url, the next url will ALWAYS exist and is infinite.
I could technically modify the script to pull a single page of results, store the next url, and stop. Then a schedule trigger of "every minute" could pull the next batch of results starting from the stored "nexturl". This way it would act more like a start/stop job on a recurring trigger, which might fit the Azure Functions framework better.
Do you see any issue with that?
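To make the idea concrete, here is a minimal sketch of that "one page per timer tick" pattern as a timer-triggered PowerShell function. The endpoint URLs, the `results`/`nextUrl` property names, the index name, and the state-file location are all assumptions for illustration, not your actual API:

```powershell
# run.ps1 — sketch of a timer-triggered Azure Function (PowerShell).
# Hypothetical URLs and property names; adjust to your IDP's API shape.
param($Timer)

$stateFile = Join-Path $env:HOME 'nexturl.txt'   # simple cursor store; a blob or Table row also works

# Resume from the stored cursor, or start from the beginning on first run
$url = if (Test-Path $stateFile) { (Get-Content $stateFile -Raw).Trim() }
       else { 'https://idp.example.com/api/authlogs' }

# Pull exactly one page, then stop
$page = Invoke-RestMethod -Uri $url -Headers @{ Authorization = "Bearer $env:IDP_TOKEN" }

foreach ($entry in $page.results) {
    # Post each log entry to Elasticsearch (index name 'authlogs' assumed)
    Invoke-RestMethod -Method Post -Uri 'https://elastic.example.com/authlogs/_doc' `
        -ContentType 'application/json' -Body ($entry | ConvertTo-Json -Depth 10)
}

# Persist the cursor so the next timer invocation picks up where this one stopped
Set-Content -Path $stateFile -Value $page.nextUrl
```

Each invocation does a bounded amount of work and exits, which is exactly the trigger-work-complete shape Jeffrey describes above.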
Nov 03 2019 01:50 PM (Solution)
Interesting scenario, thanks for sharing @Andrew Huddleston. Does your IDP ever run out of information? As in, does the next url ever return an empty result? Is there a concern that if you change to every minute, more than one function could post the same information to Elasticsearch?
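One common guard against that duplicate-post concern is to make the indexing idempotent. As a sketch, assuming each log entry carries a unique `id` field (an assumption about your IDP's payload), indexing with an explicit document ID via PUT turns a repeated post into an overwrite rather than a duplicate:

```powershell
# Sketch: idempotent indexing into Elasticsearch.
# Assumes each $entry has a unique 'id' property; index name is hypothetical.
$docId = $entry.id

# PUT with an explicit _id: re-sending the same entry overwrites instead of duplicating
Invoke-RestMethod -Method Put `
    -Uri "https://elastic.example.com/authlogs/_doc/$docId" `
    -ContentType 'application/json' `
    -Body ($entry | ConvertTo-Json -Depth 10)
```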
Depending on where you see this solution going, you might want to look at pushing this information into the Event Hub. Take this as a basis. Your situation is simpler now but the nice thing is once you are in the cloud you have more flexibility as to how to handle the information.
My suggestion, though, is to start with what you have and see if you can get it to run in Azure Functions. You will need something to start your function: an HTTP call, a file drop, or a timer, for example. Once it is running, you could try to just let it run continuously. My concern is that after an unpredictable amount of time the function will stop without a clear reason why, so you will need some mechanism to start it again. That is why I like the timer idea.
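For reference, the every-minute timer trigger mentioned above is just a binding in the function's `function.json` (the parameter name must match the one your script declares; the six-field NCRONTAB schedule below fires at the top of every minute):

```json
{
  "bindings": [
    {
      "name": "Timer",
      "type": "timerTrigger",
      "direction": "in",
      "schedule": "0 */1 * * * *"
    }
  ]
}
```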
This is an interesting one though, and I for one would appreciate if you did a post on how you solve this. Cheers - Jeff