So, I've got a database on an MSSQL 2016 server. What is the best way to move the data to MapR and do my analysis?
Please see if the links below help you get started in that direction.
And this too - Best Practices on Migrating from a Data Warehouse to a Big Data Platform | MapR.
Thanks a lot MUFEED USMAN. In my case, my raw data is in MSSQL 2016 tables. I don't have SSIS installed yet. I've tried exporting the data as JSON files split by individual months, but I'm still having trouble exporting it and analyzing it from Tableau. I'm thinking of importing the JSON data back into HBase or Hive tables so that Tableau can access it more easily. Are there any how-tos or tutorials for importing JSON files into HBase or Hive?
Just a couple of comments.
You don't have to bring the data over and import it into JSON docs. JSON is just one way to represent the data as a hierarchical model. The underlying issue is that Big Data platforms aren't really relational: joining relational tables is expensive when you're providing the data in record format, so it often pays to flatten (denormalize) the data yourself. In Hive and Pig you have the concept of bags, which let you represent a subset of child records within a single row of data.
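To illustrate the flattening idea (a sketch only — the table and field names here are made up, and in practice you'd express this in HiveQL or Pig rather than Python): a one-to-many join result can be collapsed into one record per parent row, with the child rows nested inside it, which is the same shape as a Hive `array<struct>` or a Pig bag.

```python
# Sketch: collapse a flat one-to-many join (customer columns repeated
# once per order) into one record per customer with the orders nested.
# All table/field names are hypothetical.
from collections import defaultdict

def nest_children(join_rows, parent_key, child_fields):
    parents = {}              # parent_key -> parent-level columns
    bags = defaultdict(list)  # parent_key -> list of nested child records
    for row in join_rows:
        k = row[parent_key]
        parents[k] = {f: v for f, v in row.items() if f not in child_fields}
        bags[k].append({f: row[f] for f in child_fields})
    return [dict(p, children=bags[k]) for k, p in parents.items()]

# Flat output of a SQL join: customer fields repeat for every order.
rows = [
    {"cust_id": 1, "name": "Acme",   "order_id": 10, "amount": 9.99},
    {"cust_id": 1, "name": "Acme",   "order_id": 11, "amount": 4.50},
    {"cust_id": 2, "name": "Zenith", "order_id": 12, "amount": 7.25},
]
nested = nest_children(rows, "cust_id", {"order_id", "amount"})
# Two records now, one per customer, each carrying its bag of orders.
```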
(using the Control-A and Control-B delimiters)
If you're storing the data in MapR-DB, you could store a subset of the data within a single cell, because everything is a byte array.
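For example (a minimal sketch with made-up field names — MapR-DB's real client API is Java/HBase-style, so plain Python is used here only to show the idea): since a cell is just an uninterpreted byte array, you can serialize a whole group of related fields into one value and decode it when you read it back.

```python
# Sketch: pack several related fields into one value, since a
# MapR-DB / HBase cell is just bytes. Field names are hypothetical.
import json

line_items = [{"sku": "A-1", "qty": 2}, {"sku": "B-7", "qty": 1}]

cell_bytes = json.dumps(line_items).encode("utf-8")  # what you'd store
restored = json.loads(cell_bytes.decode("utf-8"))    # what you'd read back

assert restored == line_items
```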
Of course, JSON also works well because more tools understand it. If you can export the records as JSON (the output of a SELECT statement, for example), you can use Spark to read the file, build a DataFrame, and use Spark SQL to transpose it as needed. Note that there are other tools available as well.
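One practical detail worth knowing: by default `spark.read.json` (and Hive's JSON SerDe) expect newline-delimited JSON — one object per line — not a single big array, so a SELECT exported as an array needs converting first. Both that conversion and the kind of long-to-wide transpose you'd otherwise write in Spark SQL can be sketched in plain Python (field names and values here are made up):

```python
# Sketch with hypothetical field names:
# (1) turn a JSON-array export into the newline-delimited form that
#     spark.read.json and Hive's JSON SerDe expect by default;
# (2) pivot month rows into columns -- the "transpose" you'd otherwise
#     express in Spark SQL over the DataFrame.
import json

export = ('[{"product": "X", "month": "2016-01", "sales": 100},'
          ' {"product": "X", "month": "2016-02", "sales": 120},'
          ' {"product": "Y", "month": "2016-01", "sales": 80}]')

# 1) one JSON object per line
ndjson = "\n".join(json.dumps(rec) for rec in json.loads(export))

# 2) long -> wide: one row per product, one column per month
wide = {}
for line in ndjson.splitlines():
    rec = json.loads(line)
    wide.setdefault(rec["product"], {})[rec["month"]] = rec["sales"]

# wide == {"X": {"2016-01": 100, "2016-02": 120}, "Y": {"2016-01": 80}}
```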
(Note: HBase doesn't understand JSON per se )