
Spark maprfs library

Question asked by nelson_verdier on May 23, 2014
Latest reply on May 23, 2014 by mandoskippy
Hi all,

I am trying to run Spark on a MapR cluster. I successfully ran several custom applications on a previous non-MapR Hadoop cluster, but I can't get them working on the MapR one.
To be more specific, I am not able to read from or write to MapR-FS (maprfs) without running into a Java serialization error.
Note that everything works fine when I run the app in local mode.

The test application is built using sbt with the following dependencies:

 - org.apache.spark spark-core 0.9.1
 - org.apache.hadoop hadoop-core 2.3.0-mapr-4.0.0-beta (so the app recognizes the maprfs:/// scheme)
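
For reference, the dependency setup above would look roughly like the following build.sbt. This is a minimal sketch, not taken from the original post: the project name, Scala version, and the MapR Maven repository resolver are assumptions (MapR artifacts are not in Maven Central, so some resolver pointing at MapR's repository is needed for the hadoop-core artifact to be found).

```scala
// build.sbt -- minimal sketch of the dependencies described above.
// Name, scalaVersion, and the resolver URL are assumptions.
name := "spark-maprfs-test"

scalaVersion := "2.10.4"

// MapR publishes its Hadoop artifacts in its own Maven repository,
// not in Maven Central, so a resolver for it must be added.
resolvers += "MapR Releases" at "http://repository.mapr.com/maven/"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "0.9.1",
  // MapR build of hadoop-core, which provides the maprfs:/// filesystem
  "org.apache.hadoop" % "hadoop-core" % "2.3.0-mapr-4.0.0-beta"
)
```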

Am I compiling with the wrong dependencies? Should I be using a MapR-specific build of spark-core?

Regards,
Nelson
