
Spark 2.0.0 Developer Preview Hive issue

Question asked by santon on Aug 31, 2016
Latest reply on Oct 4, 2016 by santon


I'm trying to follow the steps in the Spark 2.0.0 Developer Preview, but am running into an error whenever I try to create a SQLContext.


org.apache.spark.SparkException: Unable to create database default as failed to create its directory maprfs:///hive/warehouse


The path /hive/warehouse is the correct location where we store our Hive tables, but it's unclear to me why Spark is trying to recreate that directory. Of course, I don't have permission to create it, so the job fails. Note that this happens even when I'm just building a test DataFrame; it's not specific to accessing Hive functionality.
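For what it's worth, Spark 2.0 introduced the `spark.sql.warehouse.dir` property to control where the SQL warehouse directory lives, and the session appears to try to create it eagerly on startup. A possible workaround, assuming the Developer Preview build honors this property, is to point it at a location the submitting user can actually write to (the path below is just an illustrative example, not a recommendation):

```
# spark-defaults.conf -- hypothetical workaround: point the Spark SQL
# warehouse at a directory the submitting user is allowed to create.
# The path shown here is an example placeholder, not a required value.
spark.sql.warehouse.dir  maprfs:///user/<your-username>/spark-warehouse
```

The same property can also be set per-job with `--conf spark.sql.warehouse.dir=...` on spark-submit rather than cluster-wide.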


I've been at this for a couple of days without much progress, so I'd appreciate any tips. Thanks for your help!

-Steve
