Query big tabular data


Hi,

I have tabular data with over 6,000 columns and millions of rows (over 500 MB, mostly numeric). I need to read from and write to the table continuously from two different sources (one writing, the other reading). I also want to be able to filter and query the data by rows and/or columns, and it needs to be fast and cheap. I run a query every 2 minutes, all day, every day, and I connect using R. What is the best option in Azure for this type of data? What technology should I use?
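For context, here is a minimal sketch of the polling pattern described above, assuming the data ends up in an Azure SQL Database reached from R via DBI/odbc. The server, database, table, and column names are placeholders, not real resources; credentials are read from environment variables.

```r
# Sketch: poll an Azure SQL Database table from R every 2 minutes.
# All connection details and names below are hypothetical placeholders.
library(DBI)
library(odbc)

con <- dbConnect(
  odbc::odbc(),
  Driver   = "ODBC Driver 18 for SQL Server",
  Server   = "myserver.database.windows.net",   # placeholder server
  Database = "mydb",                            # placeholder database
  UID      = Sys.getenv("AZURE_SQL_UID"),
  PWD      = Sys.getenv("AZURE_SQL_PWD"),
  Port     = 1433
)

repeat {
  # Select only the needed columns and filter rows server-side,
  # rather than pulling all 6,000 columns into R.
  result <- dbGetQuery(con, "
    SELECT col_a, col_b
    FROM measurements
    WHERE ts > DATEADD(minute, -2, SYSUTCDATETIME())
  ")
  # ... process `result` here ...
  Sys.sleep(120)  # wait 2 minutes before the next query
}
```

With this shape of access (frequent small filtered reads alongside a continuous writer), pushing the filtering into the query rather than into R keeps the transferred data small, whatever Azure service ends up holding the table.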
