Time series resampling with Data Factory


Dear Community,
 
I am currently working on a data analytics project and face the challenge of handling time series data within Azure.
 
We have several thousand sensors that report their values to a control unit, which writes this data as JSON files into an Azure Data Lake.

We're also considering the option of writing this data directly to a database.
The sensors deliver their data at different intervals, but within our data platform we want one uniform sampling rate across all sensors. We therefore need to upsample (interpolate) a signal if its frequency is too low, or downsample (average) it if its frequency is too high.
We want to keep this as simple as possible and thought about a Data Factory job that runs every 5 minutes on the data that has newly arrived at the source. The actual implementation could be in C#, Python, R, or even JavaScript.
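
To make the transformation concrete, here is roughly the logic I have in mind, sketched in pandas (the timestamps, the column name, and the 30-second target rate are just placeholder examples; the idea stays the same wherever it ends up running):

```python
import pandas as pd

# Placeholder readings from one sensor, reported at irregular intervals.
df = pd.DataFrame(
    {"value": [1.0, 2.0, 4.0, 3.0]},
    index=pd.to_datetime([
        "2018-01-01 00:00:03",
        "2018-01-01 00:00:41",
        "2018-01-01 00:01:30",
        "2018-01-01 00:03:10",
    ]),
)

TARGET_RATE = "30s"  # the uniform sampling rate we want for all sensors

# Downsample: average all readings that fall into the same 30-second bucket.
uniform = df.resample(TARGET_RATE).mean()

# Upsample: buckets without any reading come back as NaN, so fill them by
# time-weighted linear interpolation between the neighbouring values.
uniform["value"] = uniform["value"].interpolate(method="time")

print(uniform)
```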
 
From what I've learnt about Data Factory so far, there are several ways to do this:
1) Use HDInsight
This looks like a lot of work to set up and get familiar with, so I'm looking for an easier option.
 
2) Use a U-SQL query with C#, Python, or R
As far as I understand, this is only possible via Data Lake Analytics, so a Data Lake would be the only possible input source, correct? I've sketched how I picture the Python side after this list.
 
3) Create a custom activity
This is only available in Data Factory v2, which is currently in preview, is not available in our preferred region, and has almost no Visual Studio 2017 integration. Moreover, together with the Azure Batch processing, it is a pretty complex setup altogether (also sketched below).
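
Regarding option 2, my understanding is that the Python code gets embedded in the U-SQL job and executed by Data Lake Analytics through the U-SQL Python extension (Extension.Python.Reducer). A rough sketch of what the Python side could look like; the column names sensor_id, ts, and value are my own assumptions, and as far as I know the extension passes timestamps as strings:

```python
import pandas as pd

# U-SQL's Extension.Python.Reducer calls usqlml_main once per reduce
# group (e.g. per sensor) and passes the rows in as a pandas DataFrame.
def usqlml_main(df):
    df["ts"] = pd.to_datetime(df["ts"])  # timestamps arrive as strings
    series = df.set_index("ts").sort_index()["value"]

    # Downsample into 30-second buckets by averaging, then fill the empty
    # buckets by time-weighted interpolation (upsampling).
    uniform = series.resample("30s").mean().interpolate(method="time")

    out = uniform.reset_index()
    out["ts"] = out["ts"].astype(str)    # hand timestamps back as strings
    out["sensor_id"] = df["sensor_id"].iloc[0]
    return out
```

And for option 3, as far as I understand the Batch-based custom activity, Data Factory stages an activity.json (plus linkedServices.json and datasets.json) into the working directory of the Batch task, and the script picks up its settings from there. A hypothetical sketch; the property names under extendedProperties are ones I made up:

```python
import json

# Data Factory writes activity.json into the working directory of the
# Azure Batch task that runs the custom activity.
with open("activity.json") as f:
    activity = json.load(f)

# extendedProperties is the place to pass our own settings; the names
# inputFolder and targetRate are hypothetical.
props = activity["typeProperties"]["extendedProperties"]
input_folder = props["inputFolder"]
target_rate = props["targetRate"]

# From here: read the newly arrived JSON files from input_folder and apply
# the same pandas resampling logic as sketched above.
```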
 
 
My question now is:
Is there any other possible setup I have missed? And if not, which of these solutions would you suggest?
 
 
Thank you in advance for any input on this.
