Query big tabular data


Hi,

I have tabular data with over 6,000 columns and millions of rows (over 500 MB, mostly numbers). I need to read and write to the table continuously from two different sources (one writing, the other reading). I also want to be able to filter and query the data by rows and/or columns, and it needs to be very fast and cheap. I need to run a query every 2 minutes, all day, every day, and I am connecting from R. What is the best option in Azure for this type of data? What technology should I use?
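
To illustrate the access pattern on the R side, here is a minimal sketch of the polling loop I have in mind, using DBI/odbc against a generic ODBC-reachable database. The driver, server, database, table, and column names are all placeholders, and I am not assuming any particular Azure service yet:

```r
library(DBI)
library(odbc)

# Placeholder connection details -- to be replaced once a backend is chosen.
con <- dbConnect(
  odbc::odbc(),
  Driver   = "ODBC Driver 17 for SQL Server",
  Server   = "myserver.database.windows.net",
  Database = "mydb",
  UID      = "myuser",
  PWD      = Sys.getenv("DB_PASSWORD")
)

repeat {
  # Filter by rows (WHERE) and select only the columns needed.
  result <- dbGetQuery(
    con,
    "SELECT col_1, col_2 FROM my_table WHERE col_1 > 100"
  )
  # ... process `result` here ...

  Sys.sleep(120)  # wait 2 minutes before the next query
}
```

The main requirement is that a query like this stays fast while another process is continuously writing to the same table.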
