
Unable to run MapReduce jobs due to ClassNotFoundException and jar file issue

Question asked by pvharish on Nov 7, 2014
Latest reply on Nov 19, 2014 by pvharish
When running MapReduce jobs, Hadoop is unable to find a method in the jar that I am using in my program.

Jars my program uses: jackson-all-1.9.11.jar and jackson-core-2.1.1.jar. Jar on the Hadoop classpath: jackson-core-asl-1.8.8.jar.

I added my jar to the Hadoop classpath using:

export HADOOP_CLASSPATH=jackson-all-1.9.11.jar:$HADOOP_CLASSPATH

But that did not solve the issue. Is the problem that Hadoop resolves the jar from its own filesystem first on the classpath? If so, from which file does Hadoop build its classpath? Is there any way to move the jar I am using ahead of the jars Hadoop already points to?

There is also a NoClassDefFoundError for HTableInterface. I have listed the related jar (hbase-0.94.21-mapr-1409.jar) on my classpath and also shipped the required jars using:

DistributedCache.addFileToClassPath(new Path("/user/mapr/jars/hbase-0.94.21-mapr-1409.jar"), conf);

So what might be the issue, and how do I resolve it?

Thank you,

Hareesh.
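Edit: for context, this is roughly how I launch the job. The driver jar name, main class, and input/output paths below are placeholders, not my real ones; only the HBase jar path matches what I described above:

```shell
# Client-side: put my newer Jackson jar ahead of Hadoop's bundled one
export HADOOP_CLASSPATH=/home/mapr/jars/jackson-all-1.9.11.jar:$HADOOP_CLASSPATH

# Submit the job; -libjars ships extra jars to the task nodes
# (myjob.jar, com.example.MyDriver, /input, /output are placeholders)
hadoop jar myjob.jar com.example.MyDriver \
    -libjars /user/mapr/jars/hbase-0.94.21-mapr-1409.jar \
    /input /output
```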