Would the Excel data model (Office 365, 64-bit environment) be able to handle a dataset with 11.5 million rows and 150 columns? (Note: none of these columns are calculations.) Alternatively, could it handle one table with 11.5 million rows and 20 columns (2.5 GB) and a related table with 3.7 million rows and 130 columns?
If you have an existing dataset in your data model that is driving a large amount of Excel analysis, is there a way to seamlessly replace it with a newer version of that dataset (assume the new dataset has many more rows and also quite a few new columns), such that the existing Excel analysis (PivotTables and so on) updates accordingly?
1) From a formal point of view you may consider the data model practically unlimited. More exactly, there are limits, but they are huge: 1,999,999,997 rows and 2,147,483,647 columns per table, and about the same number of tables. In practice you are limited by your memory and performance. There are techniques to optimize memory usage and performance, but they are not straightforward.
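To illustrate those optimization techniques, here is a minimal Power Query M sketch (the file path and column names are hypothetical): trimming unused columns before the load matters most, because in the model's columnar engine the column count and column cardinality drive memory use far more than the row count, and explicit data types compress better than text.

```m
let
    Source = Csv.Document(File.Contents("C:\data\big_dataset.csv"),
                          [Delimiter = ",", Encoding = 65001]),
    Promoted = Table.PromoteHeaders(Source, [PromoteAllScalars = true]),
    // Keep only the columns the analysis actually needs —
    // every column you drop here never enters the data model
    Trimmed = Table.SelectColumns(Promoted,
                  {"CustomerID", "OrderDate", "Amount"}),
    // Correctly typed, low-cardinality columns compress much better
    // than untyped text
    Typed = Table.TransformColumnTypes(Trimmed,
                {{"CustomerID", Int64.Type},
                 {"OrderDate", type date},
                 {"Amount", type number}})
in
    Typed
```

With 150 columns, loading only the subset you actually analyze can make the difference between a workbook that refreshes in minutes and one that runs out of memory.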
2) You have to load your data into the data model somehow — typically through Power Query and/or the Power Pivot connectors. After that, updating the data is just a matter of refreshing the connections and the PivotTables.
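As a sketch of that workflow (the query name and file path are hypothetical): load the query with "Only Create Connection" plus "Add this data to the Data Model". Because the query below promotes headers without hard-coding a column list, dropping a newer file at the same path and clicking Data → Refresh All re-loads the same model table, the connected PivotTables pick up the new rows, and any new columns appear as new fields in the field list.

```m
// Query "SalesData" — loaded to the data model, not to a worksheet
let
    Source = Csv.Document(File.Contents("C:\data\sales.csv"),
                          [Delimiter = ","]),
    // PromoteAllScalars keeps the query schema-agnostic, so columns
    // added to the source file flow through on the next refresh
    Promoted = Table.PromoteHeaders(Source, [PromoteAllScalars = true])
in
    Promoted
```

Existing PivotTables keep their layout across the refresh; you only need to drag any new fields into them manually, since Excel cannot guess where new columns belong in your analysis.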