
ClassNotFoundException: TableSplit not found

Question asked by endamcm on Feb 15, 2018
Latest reply on Feb 15, 2018 by MichaelSegel

I have deployed a Spark job to a MapR 5.2.1 environment.

I have packaged the following JARs using Maven:

        <dependency>
            <groupId>com.mapr.hadoop</groupId>
            <artifactId>maprfs</artifactId>
            <version>5.2.0-mapr</version>
        </dependency>

        <dependency>
            <groupId>com.mapr.fs</groupId>
            <artifactId>mapr-hbase</artifactId>
            <version>5.2.0-mapr</version>
        </dependency>

        <dependency>
            <groupId>org.apache.hbase</groupId>
            <artifactId>hbase</artifactId>
            <version>0.94.5-mapr</version>
        </dependency>

        <dependency>
            <groupId>org.apache.hbase</groupId>
            <artifactId>hbase-server</artifactId>
            <version>1.1.1-mapr-1602</version>
        </dependency>

I receive the following error when running the Spark job on the MapR cluster:

Caused by: java.lang.ClassNotFoundException: Class org.apache.hadoop.hbase.mapreduce.TableSplit not found
    at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2126)
    at org.apache.hadoop.io.ObjectWritable.loadClass(ObjectWritable.java:373)
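
For reference, the read path that pulls in this class looks roughly like the sketch below (a minimal example only; the class name, app name, and table path are placeholders, not the actual job code). TableInputFormat is the input format whose splits are org.apache.hadoop.hbase.mapreduce.TableSplit instances, so that appears to be the class the executors are trying to deserialize.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.client.Result;
    import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
    import org.apache.hadoop.hbase.mapreduce.TableInputFormat;
    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaPairRDD;
    import org.apache.spark.api.java.JavaSparkContext;

    public class TableReadSketch {
        public static void main(String[] args) {
            JavaSparkContext sc = new JavaSparkContext(
                new SparkConf().setAppName("table-read-sketch"));

            // Placeholder table path, not the real job's table.
            Configuration conf = HBaseConfiguration.create();
            conf.set(TableInputFormat.INPUT_TABLE, "/tables/example");

            // TableInputFormat produces org.apache.hadoop.hbase.mapreduce.TableSplit
            // input splits, i.e. the class the executors fail to load above.
            JavaPairRDD<ImmutableBytesWritable, Result> rows =
                sc.newAPIHadoopRDD(conf, TableInputFormat.class,
                                   ImmutableBytesWritable.class, Result.class);

            System.out.println("rows: " + rows.count());
            sc.stop();
        }
    }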

 

Any thoughts on what is causing this issue?
