
How to Export an HBase Table Using Spark DataFrames in MapR

Question asked by ramakrishnaa14 on Nov 3, 2017
Latest reply on Nov 3, 2017 by cathy

I need to know whether we can create a DataFrame on top of an HBase table, apply some transformations, and then store the output to folders.
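For context, the flow I'm after would look roughly like this (a sketch only, assuming the hbase-spark connector is on the classpath; the catalog, table name, and output path are illustrative placeholders, not real ones from my cluster):

```scala
// Rough sketch of the intended read -> transform -> write flow.
// Assumes the hbase-spark connector; all names below are placeholders.
import org.apache.spark.sql.DataFrame
import org.apache.hadoop.hbase.spark.datasources.HBaseTableCatalog

// Example catalog mapping a hypothetical HBase table "mytable"
// to DataFrame columns.
val catalog = s"""{
  |"table":{"namespace":"default", "name":"mytable"},
  |"rowkey":"key",
  |"columns":{
    |"id":{"cf":"rowkey", "col":"key", "type":"string"},
    |"value":{"cf":"cf1", "col":"value", "type":"string"}
  |}
}""".stripMargin

// Read the HBase table as a DataFrame.
val df: DataFrame = sqlContext.read
  .options(Map(HBaseTableCatalog.tableCatalog -> catalog))
  .format("org.apache.hadoop.hbase.spark")
  .load()

// Apply a transformation, then store the result in an output folder.
df.filter(df("value").isNotNull)
  .write
  .format("parquet")
  .save("/user/output/mytable_export")
```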

 

I was trying the steps in the link below:

SparkSQL and DataFrames

But I am not sure whether I have to install some additional plugins, since it throws an error like "DataFrame not defined."

 

I get the error below when I try the example mentioned in the link:

scala> def withCatalog(cat: String): DataFrame = {
| sqlContext.read.options(Map(HBaseTableCatalog.tableCatalog->cat)).format("org.apache.hadoop.hbase.spark").load()}
<console>:37: error: not found: type DataFrame
def withCatalog(cat: String): DataFrame = {
^
<console>:38: error: not found: value HBaseTableCatalog
sqlContext.read.options(Map(HBaseTableCatalog.tableCatalog->cat)).format("org.apache.hadoop.hbase.spark").load()}
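For reference, both "not found" errors look like missing imports in spark-shell rather than a missing plugin. I suspect the example needs something like the following first (I'm guessing at the package names here; they are what I'd expect for the hbase-spark connector and may differ by connector version):

```scala
// Imports the spark-shell example appears to assume
// (package names are my assumption, not confirmed for this MapR release):
import org.apache.spark.sql.DataFrame
import org.apache.hadoop.hbase.spark.datasources.HBaseTableCatalog

// The example's helper, unchanged apart from formatting:
def withCatalog(cat: String): DataFrame = {
  sqlContext.read
    .options(Map(HBaseTableCatalog.tableCatalog -> cat))
    .format("org.apache.hadoop.hbase.spark")
    .load()
}
```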
