
BaseTableMappingRules not found Error in Spark when accessing MapR-DB

Question asked by danielsobrado on Dec 26, 2016
Latest reply on Jan 10, 2017 by danielsobrado

I'm starting Spark 1.6.1 on MapR 5.1 from the command-line REPL (spark-shell) with:

 

./spark-shell --jars /opt/mapr/lib/hadoop-common-2.7.0.jar,/opt/mapr/lib/maprfs-5.1.0-mapr.jar,/opt/mapr/lib/maprdb-5.1.0-mapr.jar,/opt/mapr/lib/maprdb-mapreduce-5.1.0-mapr.jar,/opt/mapr/hbase/hbase-1.1.1/lib/hbase-client-1.1.1-mapr-1602.jar,/opt/mapr/hbase/hbase-1.1.1/lib/hbase-server-1.1.1-mapr-1602.jar,/opt/mapr/hbase/hbase-1.1.1/lib/hbase-hadoop-compat-1.1.1-mapr-1602.jar,/opt/mapr/hbase/hbase-1.1.1/lib/hbase-common-1.1.1-mapr-1602.jar,/opt/mapr/hbase/hbase-1.1.1/lib/hbase-hadoop2-compat-1.1.1-mapr-1602.jar,/opt/mapr/lib/mapr-hbase-5.1.0-mapr.jar,/opt/mapr/hbase/hbase-1.1.1/lib/hbase-common-1.1.1-mapr-1602.jar,/opt/mapr/hbase/hbase-1.1.1/lib/hbase-client-1.1.1-mapr-1602.jar,/opt/mapr/hadoop/hadoop-0.20.2/lib/maprdb-5.1.0-mapr.jar,/opt/mapr/hadoop/hadoop-0.20.2/lib/hadoop-annotations-2.7.0-mapr-1602.jar,/opt/mapr/hbase/hbase-1.1.1/lib/hbase-protocol-1.1.1-mapr-1602.jar,/opt/mapr/hbase/hbase-1.1.1/lib/hbase-procedure-1.1.1-mapr-1602.jar,/opt/mapr/hbase/hbase-1.1.1/lib/hbase-hadoop-compat-1.1.1-mapr-1602.jar,/opt/mapr/hbase/hbase-1.1.1/lib/hbase-hadoop2-compat-1.1.1-mapr-1602.jar,/opt/mapr/hbase/hbase-1.1.1/lib/hbase-prefix-tree-1.1.1-mapr-1602.jar,/opt/mapr/hbase/hbase-1.1.1/lib/hbase-it-1.1.1-mapr-1602.jar,/opt/mapr/hadoop/hadoop-0.20.2/lib/central-logging-5.1.0-mapr.jar,/opt/mapr/hadoop/hadoop-0.20.2/lib/hadoop-hdfs-2.7.0-mapr-1602-tests.jar,/opt/mapr/hadoop/hadoop-0.20.2/lib/maprfs-diagnostic-tools-5.1.0-mapr.jar,/opt/mapr/hadoop/hadoop-0.20.2/lib/hadoop-annotations-2.7.0-mapr-1602.jar,/opt/mapr/hadoop/hadoop-0.20.2/lib/hadoop-hdfs-nfs-2.7.0-mapr-1602.jar,/opt/mapr/hadoop/hadoop-0.20.2/lib/mapr-hbase-5.1.0-mapr.jar,/opt/mapr/hadoop/hadoop-0.20.2/lib/hadoop-auth-2.7.0-mapr-1602.jar,/opt/mapr/hadoop/hadoop-0.20.2/lib/maprdb-5.1.0-mapr.jar,/opt/mapr/hadoop/hadoop-0.20.2/lib/mapr-hbase-5.1.0-mapr-tests.jar,/opt/mapr/hadoop/hadoop-0.20.2/lib/hadoop-aws-2.7.0-mapr-1602.jar,/opt/mapr/hadoop/hadoop-0.20.2/lib/maprdb-5.1.0-mapr-tests.jar,/opt/mapr/hadoop/hadoop-0.20.2/lib/mapr-java-utils-5.1.0-mapr-tests.jar,/opt/mapr/hadoop/hadoop-0.20.2/lib/hadoop-azure-2.7.0-mapr-1602.jar,/opt/mapr/hadoop/hadoop-0.20.2/lib/maprdb-mapreduce-5.1.0-mapr.jar,/opt/mapr/hadoop/hadoop-0.20.2/lib/mapr-tools-5.1.0-mapr.jar,/opt/mapr/hadoop/hadoop-0.20.2/lib/hadoop-common-2.7.0-mapr-1602.jar,/opt/mapr/hadoop/hadoop-0.20.2/lib/maprdb-mapreduce-5.1.0-mapr-tests.jar,/opt/mapr/hadoop/hadoop-0.20.2/lib/mapr-tools-5.1.0-mapr-tests.jar,/opt/mapr/hadoop/hadoop-0.20.2/lib/hadoop-common-2.7.0-mapr-1602-tests.jar,/opt/mapr/hadoop/hadoop-0.20.2/lib/maprdb-shell-5.1.0-mapr.jar,/opt/mapr/hadoop/hadoop-0.20.2/lib/ojai-mapreduce-1.0.jar,/opt/mapr/hadoop/hadoop-0.20.2/lib/hadoop-hdfs-2.7.0-mapr-1602.jar,/opt/mapr/hadoop/hadoop-0.20.2/lib/maprfs-5.1.0-mapr.jar,/opt/mapr/hadoop/hadoop-0.20.2/lib/zookeeper-3.4.5-mapr-1503.jar,/opt/mapr/hbase/hbase-1.1.1/lib/hbase-annotations-1.1.1-mapr-1602.jar,/opt/mapr/hbase/hbase-1.1.1/lib/hbase-annotations-1.1.1-mapr-1602-tests.jar,/opt/mapr/hbase/hbase-1.1.1/lib/hbase-client-1.1.1-mapr-1602.jar,/opt/mapr/hbase/hbase-1.1.1/lib/hbase-common-1.1.1-mapr-1602.jar,/opt/mapr/hbase/hbase-1.1.1/lib/hbase-common-1.1.1-mapr-1602-tests.jar,/opt/mapr/hbase/hbase-1.1.1/lib/hbase-examples-1.1.1-mapr-1602.jar,/opt/mapr/hbase/hbase-1.1.1/lib/hbase-hadoop2-compat-1.1.1-mapr-1602.jar,/opt/mapr/hbase/hbase-1.1.1/lib/hbase-hadoop-compat-1.1.1-mapr-1602.jar,/opt/mapr/hbase/hbase-1.1.1/lib/hbase-it-1.1.1-mapr-1602.jar,/opt/mapr/hbase/hbase-1.1.1/lib/hbase-it-1.1.1-mapr-1602-tests.jar,/opt/mapr/hbase/hbase-1.1.1/lib/hbase-prefix-tree-1.1.1-mapr-1602.jar,/opt/mapr/hbase/hbase-1.1.1/lib/hbase-procedure-1.1.1-mapr-1602.jar,/opt/mapr/hbase/hbase-1.1.1/lib/hbase-protocol-1.1.1-mapr-1602.jar,/opt/mapr/hbase/hbase-1.1.1/lib/hbase-rest-1.1.1-mapr-1602.jar,/opt/mapr/hbase/hbase-1.1.1/lib/hbase-server-1.1.1-mapr-1602.jar,/opt/mapr/hbase/hbase-1.1.1/lib/hbase-server-1.1.1-mapr-1602-tests.jar,/opt/mapr/hbase/hbase-1.1.1/lib/hbase-shell-1.1.1-mapr-1602.jar,/opt/mapr/hbase/hbase-1.1.1/lib/hbase-thrift-1.1.1-mapr-1602.jar
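
As a sanity check on this long list, I can verify from inside the shell which jars Spark actually registered. A minimal sketch, assuming the standard Spark 1.6 configuration keys:

// Jars passed via --jars are recorded in the spark.jars property
sc.getConf.getOption("spark.jars").foreach(_.split(",").foreach(println))
// The driver JVM's own classpath, for comparison
sys.props("java.class.path").split(java.io.File.pathSeparator).foreach(println)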

 

My code is very simple:

 

import org.apache.spark._
import org.apache.spark.rdd.NewHadoopRDD
import org.apache.hadoop.hbase.{HBaseConfiguration, HTableDescriptor}
import org.apache.hadoop.hbase.client.HBaseAdmin
import org.apache.hadoop.hbase.mapreduce.TableInputFormat
import org.apache.hadoop.fs.Path
import org.apache.hadoop.hbase.HColumnDescriptor
import org.apache.hadoop.hbase.util.Bytes
import org.apache.hadoop.hbase.client.Put
import org.apache.hadoop.hbase.client.HTable
import org.apache.hadoop.hbase.mapred.TableOutputFormat
import org.apache.hadoop.mapred.JobConf
import org.apache.hadoop.hbase.io.ImmutableBytesWritable
import org.apache.hadoop.mapreduce.Job
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat
import org.apache.hadoop.hbase.KeyValue
import org.apache.hadoop.hbase.mapreduce.HFileOutputFormat
import org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles

 

val conf = HBaseConfiguration.create()
// MapR client-side impersonation settings for the session user
conf.set("hadoop.spoofed.user.uid", "5000")
conf.set("hadoop.spoofed.user.gid", "5000")
conf.set("hadoop.spoofed.user.username", "mapr")

 

val tableName = "/scheme/Table"

 

conf.addResource(new Path("file:///opt/mapr/hbase/hbase-1.1.1/conf/hbase-site.xml"))
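
To confirm that hbase-site.xml is actually picked up, I also read a property back after addResource. A quick sketch, assuming the file defines the usual hbase.zookeeper.quorum entry:

// If hbase-site.xml was loaded, its entries should now be visible on conf
println(conf.get("hbase.zookeeper.quorum"))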

 

val table = new HTable(conf, tableName)

I'm getting the following error:

 

java.io.IOException: java.lang.RuntimeException: Error occurred while instantiating com.mapr.fs.hbase.MapRTableMappingRules.
==> org/apache/hadoop/hbase/client/mapr/BaseTableMappingRules.
at org.apache.hadoop.hbase.client.mapr.TableMappingRulesFactory.create(TableMappingRulesFactory.java:68)
at org.apache.hadoop.hbase.client.HTable.initIfMapRTableImpl(HTable.java:475)
at org.apache.hadoop.hbase.client.HTable.initIfMapRTable(HTable.java:443)
at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:194)
at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:161)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:102)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:107)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:109)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:111)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:113)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:115)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:117)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:119)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:121)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:123)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:125)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:127)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:129)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:131)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:133)
at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:135)
at $iwC$$iwC$$iwC$$iwC.<init>(<console>:137)
at $iwC$$iwC$$iwC.<init>(<console>:139)
at $iwC$$iwC.<init>(<console>:141)
at $iwC.<init>(<console>:143)
at <init>(<console>:145)
at .<init>(<console>:149)
at .<clinit>(<console>)
at .<init>(<console>:7)
at .<clinit>(<console>)
at $print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:657)
at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:665)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:670)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:997)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:752)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.RuntimeException: Error occurred while instantiating com.mapr.fs.hbase.MapRTableMappingRules.
==> org/apache/hadoop/hbase/client/mapr/BaseTableMappingRules.
at org.apache.hadoop.hbase.client.mapr.GenericHFactory.getImplementorInstance(GenericHFactory.java:40)
at org.apache.hadoop.hbase.client.mapr.TableMappingRulesFactory.create(TableMappingRulesFactory.java:50)
... 62 more
Caused by: java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/client/mapr/BaseTableMappingRules
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:411)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:264)
at org.apache.hadoop.hbase.client.mapr.GenericHFactory.getImplementorInstance(GenericHFactory.java:30)
... 63 more
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.client.mapr.BaseTableMappingRules
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 79 more
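
The missing class lives in the MapR-specific org.apache.hadoop.hbase.client.mapr package, so this looks like the jar providing it is not visible to the driver's classloader, even though everything is listed under --jars. To narrow it down, I can probe both classes from the error in the same spark-shell session. A rough diagnostic sketch, not a fix:

// Report whether each class resolves on the driver, and from which jar it was loaded
def probe(name: String): Unit =
  try {
    val c = Class.forName(name)
    println(s"$name -> ${c.getProtectionDomain.getCodeSource}")
  } catch { case t: Throwable => println(s"$name -> $t") }

probe("com.mapr.fs.hbase.MapRTableMappingRules")
probe("org.apache.hadoop.hbase.client.mapr.BaseTableMappingRules")

If the second probe fails while the first succeeds, the jar that contains BaseTableMappingRules is presumably not visible where HTable needs it, even though it appears in the --jars list.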
