
ClassNotFoundException when running an sbt-built jar with spark-submit

Question asked by karthikSpark on Dec 29, 2016
Latest reply on Jan 20, 2017 by maprcommunity

Hi all,
I'm using sbt to build my project, and here is its structure:

HiveGenerator
├── build.sbt
├── lib
├── project
│   ├── assembly.sbt
│   └── plugins.sbt
└── src
    └── main
        └── scala
            └── Main.scala


But I'm facing the error **"java.lang.ClassNotFoundException: package.classname"** no matter how many times I build it. I have used

sbt clean package
sbt clean assembly

but with no luck; my class is always missing from the jar.
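(This is how I check: listing the jar's contents after each build. The path below assumes sbt's default output location for this project, and "classname" is a placeholder for my class.)

jar tf target/scala-2.10/kafkaToMaprfs-assembly-1.0.jar | grep classname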

Here is my build.sbt

 

lazy val root = (project in file(".")).
  settings(
    name := "kafkaToMaprfs",
    version := "1.0",
    scalaVersion := "2.10.5",
    mainClass in Compile := Some("classname")
  )

libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-hive_2.10" % "1.6.1",
  "org.apache.spark" % "spark-core_2.10" % "1.6.1",
  "org.apache.spark" % "spark-sql_2.10" % "1.6.1",
  "com.databricks" % "spark-avro_2.10" % "2.0.1",
  "org.apache.avro" % "avro" % "1.8.1",
  "org.apache.avro" % "avro-mapred" % "1.8.1",
  "org.apache.avro" % "avro-tools" % "1.8.1",
  "org.apache.spark" % "spark-streaming_2.10" % "1.6.1",
  "org.apache.spark" % "spark-streaming-kafka_2.10" % "1.6.1",
  "org.codehaus.jackson" % "jackson-mapper-asl" % "1.9.13",
  "org.openrdf.sesame" % "sesame-rio-api" % "2.7.2",
  "log4j" % "log4j" % "1.2.17",
  "com.twitter" % "bijection-avro_2.10" % "0.7.0"
)


mergeStrategy in assembly <<= (mergeStrategy in assembly) { (old) =>
  {
    case PathList("META-INF", xs @ _*) => MergeStrategy.discard
    case x => MergeStrategy.first
  }
}
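For reference, this is how I read the equivalent settings in the sbt-assembly 0.14.x README. Here com.example.Main is a placeholder for my fully qualified class name, and scoping the Spark artifacts as provided is my assumption about what spark-submit expects, not something I've verified:

// assumption: mainClass should be the fully qualified name, not the bare class name
mainClass in assembly := Some("com.example.Main")

// assumption: Spark is already on the cluster classpath at runtime,
// so its artifacts can be scoped as provided to slim the fat jar
libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.6.1" % "provided"

// assemblyMergeStrategy is the non-deprecated spelling of mergeStrategy in 0.14.x
assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case _ => MergeStrategy.first
}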

**Here is my assembly.sbt**

addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.3")

**plugins.sbt**

addSbtPlugin("com.typesafe.sbt" % "sbt-site" % "0.7.0")
resolvers += Resolver.url("bintray-sbt-plugins", url("http://dl.bintray.com/sbt/sbt-plugin-releases"))(Resolver.ivyStylePatterns)
resolvers += "OSS Sonatype" at "https://repo1.maven.org/maven2/"

However, I'm not able to build a fat jar, or what Maven would call a jar-with-dependencies.jar.


In Maven we have

<descriptorRefs>
  <descriptorRef>jar-with-dependencies</descriptorRef>
</descriptorRefs>

which helped me accomplish this.
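From what I can tell, the closest sbt-assembly analogue is the assembly task itself; the setting below is my guess at how to mimic Maven's naming, using the assemblyJarName key:

// by default, sbt assembly writes target/scala-2.10/<name>-assembly-<version>.jar;
// this renames it to match Maven's jar-with-dependencies convention
assemblyJarName in assembly := "kafkaToMaprfs-jar-with-dependencies.jar"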

My questions are:

1. Why am I not building a jar with all the classes in it?

2. Which commands should I use to create a jar with dependencies in sbt?

3. Do we have anything equivalent to **"descriptorRefs"** in sbt to do the magic?

One last question, which I haven't found an answer to: can't we achieve a proper output with sbt alone, or should we always use spark-submit to run it

(not considering local or cluster modes)?
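For context, the spark-submit invocation I mean has this shape (the class name and jar path are placeholders):

spark-submit --class com.example.Main target/scala-2.10/kafkaToMaprfs-assembly-1.0.jar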

Thanks in advance.
