
Spark from Apache Downloads Site for MapR

Question asked by mandoskippy on May 19, 2015
Latest reply on Jun 16, 2015 by mandoskippy
I downloaded the Spark 1.3.1 binary for MapR, and when I try to run a simple HiveContext query with it, I get the errors below (it looks like it can't get the MapR client running).
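For reference, the query itself is nothing exotic. A minimal sketch of the kind of thing I'm running (the table name is a placeholder, not my actual table):

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.hive.HiveContext

    object HiveQueryTest {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("HiveQueryTest"))
        val hive = new HiveContext(sc)
        // Running anything on the executors forces the Hadoop Configuration,
        // and with it com.mapr.fs.ShimLoader, to load -- which is where the
        // NullPointerException below shows up.
        hive.sql("SELECT COUNT(*) FROM sample_table").collect().foreach(println)
      }
    }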

That said, I used the same conf files (updating the versions where needed) and the same process to run the Spark 1.2.0 MapR binary from the Apache website, and that same query works fine. Any thoughts?
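For context, the MapR-related conf entries are along these lines in spark-defaults.conf (a sketch assuming the default /opt/mapr client layout; treat the exact paths as illustrative rather than exact):

    # Put the MapR client jars on the driver and executor classpaths so
    # com.mapr.fs.ShimLoader can be found. Paths assume a standard
    # /opt/mapr install.
    spark.driver.extraClassPath       /opt/mapr/lib/*
    spark.executor.extraClassPath     /opt/mapr/lib/*
    # The native MapR client library normally lives here; ShimLoader
    # needs it on the library path.
    spark.driver.extraLibraryPath     /opt/mapr/lib
    spark.executor.extraLibraryPath   /opt/mapr/lib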

15/05/19 09:31:26 INFO MemoryStore: MemoryStore started with capacity 1060.3 MB
java.lang.NullPointerException
    at com.mapr.fs.ShimLoader.getRootClassLoader(ShimLoader.java:96)
    at com.mapr.fs.ShimLoader.injectNativeLoader(ShimLoader.java:232)
    at com.mapr.fs.ShimLoader.load(ShimLoader.java:194)
    at org.apache.hadoop.conf.CoreDefaultProperties.<clinit>(CoreDefaultProperties.java:60)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:274)
    at org.apache.hadoop.conf.Configuration.getClassByNameOrNull(Configuration.java:1847)
    at org.apache.hadoop.conf.Configuration.getProperties(Configuration.java:2062)
    at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2272)
    at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2224)
    at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2141)
    at org.apache.hadoop.conf.Configuration.set(Configuration.java:992)
    at org.apache.hadoop.conf.Configuration.set(Configuration.java:966)
    at org.apache.spark.deploy.SparkHadoopUtil.newConfiguration(SparkHadoopUtil.scala:98)
    at org.apache.spark.deploy.SparkHadoopUtil.<init>(SparkHadoopUtil.scala:43)
    at org.apache.spark.deploy.SparkHadoopUtil$.<init>(SparkHadoopUtil.scala:220)
    at org.apache.spark.deploy.SparkHadoopUtil$.<clinit>(SparkHadoopUtil.scala)
    at org.apache.spark.util.Utils$.getSparkOrYarnConfig(Utils.scala:1959)
    at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:104)
    at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:179)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:310)
    at org.apache.spark.SparkEnv$.createExecutorEnv(SparkEnv.scala:186)
    at org.apache.spark.executor.MesosExecutorBackend.registered(MesosExecutorBackend.scala:70)
java.lang.RuntimeException: Failure loading MapRClient.
    at com.mapr.fs.ShimLoader.injectNativeLoader(ShimLoader.java:283)
    at com.mapr.fs.ShimLoader.load(ShimLoader.java:194)
    at org.apache.hadoop.conf.CoreDefaultProperties.<clinit>(CoreDefaultProperties.java:60)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:274)
    at org.apache.hadoop.conf.Configuration.getClassByNameOrNull(Configuration.java:1847)
    at org.apache.hadoop.conf.Configuration.getProperties(Configuration.java:2062)
    at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2272)
    at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2224)
    at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2141)
    at org.apache.hadoop.conf.Configuration.set(Configuration.java:992)
    at org.apache.hadoop.conf.Configuration.set(Configuration.java:966)
    at org.apache.spark.deploy.SparkHadoopUtil.newConfiguration(SparkHadoopUtil.scala:98)
    at org.apache.spark.deploy.SparkHadoopUtil.<init>(SparkHadoopUtil.scala:43)
    at org.apache.spark.deploy.SparkHadoopUtil$.<init>(SparkHadoopUtil.scala:220)
    at org.apache.spark.deploy.SparkHadoopUtil$.<clinit>(SparkHadoopUtil.scala)
    at org.apache.spark.util.Utils$.getSparkOrYarnConfig(Utils.scala:1959)
    at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:104)
    at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:179)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:310)
    at org.apache.spark.SparkEnv$.createExecutorEnv(SparkEnv.scala:186)
    at org.apache.spark.executor.MesosExecutorBackend.registered(MesosExecutorBackend.scala:70)
