Oct 25 2017
03:45 PM
- last edited on
Jul 31 2018
12:07 PM
by
TechCommunityAP
Hi,
I have tabular data with over 6,000 columns and millions of rows (over 500 MB, mostly numeric values). I need to read from and write to the table continuously from two different sources (one writing, the other reading). I also want to be able to filter and query the data by rows and/or columns, and it needs to be very fast and cheap. I need to run a query every 2 minutes, all day, every day, and I am connecting from the R language. What is the best option in Azure for dealing with this type of data? What technology should I use?