
Not able to save dataframe to Hive when I launch the application using spark-submit

Question asked by Kashivishwanath on Jun 25, 2018
Latest reply on Jun 28, 2018 by vmeghraj

Hello All,

I wrote a simple Spark Streaming application in Scala that streams data from a MapR Streams topic, creates a dataframe, and saves the dataframe to Hive and MapR-DB. When I execute this code by pasting it into the spark-shell, it works absolutely fine, i.e., it is able to save the dataframe to both Hive and MapR-DB. However, when I launch the application using spark-submit, I am able to save the dataframe to MapR-DB but unable to save it to Hive. The only error I am getting is:
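For context, the save logic in my application is essentially the following simplified sketch (the app name, table name, and sample data below are placeholders, not the exact ones from my job, and the streaming/MapR-DB parts are elided):

```scala
import org.apache.spark.sql.SparkSession

object App {
  def main(args: Array[String]): Unit = {
    // Note: spark-shell builds its session with Hive support already enabled;
    // a submitted application has to request it explicitly.
    val spark = SparkSession.builder()
      .appName("har-stream") // placeholder name
      .enableHiveSupport()
      .getOrCreate()

    import spark.implicits._

    // In the real job the dataframe is built from the streamed records;
    // a small static dataframe stands in for it here.
    val df = Seq((1, "a"), (2, "b")).toDF("id", "value")

    // Save to Hive -- this is the step that fails under spark-submit.
    df.write.mode("append").saveAsTable("testtable123")

    // The MapR-DB save (via the OJAI connector) is omitted here; that path works.
  }
}
```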


SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See for further details.
18/06/25 15:29:06 ERROR MapRFileSystem: Failed to delete path maprfs:/user/root/spark-warehouse/testtable123/_temporary-72fcc5b4-5b02-4f1c-84cf-695fcaa423f4, error: No such file or directory (2)
Time: 1529954950000 ms 


Below is my spark-submit command: 

./spark-submit --class org.hex.har.App --master yarn /root/harread/target/har-1.0-SNAPSHOT.jar 


Do I need to provide any additional Hive parameters in the spark-submit command, or do I need to add any paths to the configuration files? Please help me resolve this issue. Thank you in advance.

I am using a MapR 6 cluster with Spark 2.1.0 and Hive 2.1.