
Spark 2.0.1 sample pi job failing

Question asked by bgajjela on Dec 31, 2016
Latest reply on Jan 10, 2017 by asukhenko

Hi,

We recently installed Spark 2.0.1 on a new cluster. While validating Spark 2.0.1, we got stuck on the issue below.

$ /opt/mapr/spark/spark-2.0.1/bin/run-example --master yarn --deploy-mode client SparkPi 10

16/12/31 19:15:32 ERROR SparkContext: Error initializing SparkContext.
org.apache.spark.SparkException: Yarn application has already ended! It might have been killed or unable to launch application master.
        at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.waitForApplication(YarnClientSchedulerBackend.scala:85)
        at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:62)
        at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:149)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:497)
        at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2278)
        at org.apache.spark.sql.SparkSession$Builder$$anonfun$8.apply(SparkSession.scala:831)
        at org.apache.spark.sql.SparkSession$Builder$$anonfun$8.apply(SparkSession.scala:823)
        at scala.Option.getOrElse(Option.scala:121)
        at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:823)
        at org.apache.spark.examples.SparkPi$.main(SparkPi.scala:31)
        at org.apache.spark.examples.SparkPi.main(SparkPi.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:736)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:185)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:210)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
16/12/31 19:15:33 WARN YarnSchedulerBackend$YarnSchedulerEndpoint: Attempted to request executors before the AM has registered!
16/12/31 19:15:33 WARN MetricsSystem: Stopping a MetricsSystem that is not running
Exception in thread "main" org.apache.spark.SparkException: Yarn application has already ended! It might have been killed or unable to launch application master.
        at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.waitForApplication(YarnClientSchedulerBackend.scala:85)
        at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:62)
        at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:149)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:497)
        at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2278)
        at org.apache.spark.sql.SparkSession$Builder$$anonfun$8.apply(SparkSession.scala:831)
        at org.apache.spark.sql.SparkSession$Builder$$anonfun$8.apply(SparkSession.scala:823)
        at scala.Option.getOrElse(Option.scala:121)
        at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:823)
        at org.apache.spark.examples.SparkPi$.main(SparkPi.scala:31)
        at org.apache.spark.examples.SparkPi.main(SparkPi.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:736)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:185)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:210)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

When I checked the ResourceManager container stderr log, this is the output I found:

Error: Could not find or load main class org.apache.spark.deploy.yarn.ExecutorLauncher
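
My understanding is that this error usually means the YARN containers cannot find the Spark runtime classes, i.e. spark.yarn.jars or spark.yarn.archive is unset or points at the wrong location. As a quick sanity check (a sketch; the exact JAR file name below is an assumption, match whatever actually sits in your jars directory):

$ # confirm the YARN support JAR is present and contains ExecutorLauncher
$ ls /opt/mapr/spark/spark-2.0.1/jars/ | grep yarn
$ unzip -l /opt/mapr/spark/spark-2.0.1/jars/spark-yarn_2.11-2.0.1.jar | grep ExecutorLauncher
$ # confirm the JAR location properties are actually set
$ grep -E 'spark\.yarn\.(jars|archive)' /opt/mapr/spark/spark-2.0.1/conf/spark-defaults.conf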

I believe I followed all the steps in the Spark configuration link below:

Configure Spark JAR Location (Spark 2.0.1) 
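
As I understand it, those steps boil down to bundling the Spark JARs, publishing the bundle on MapR-FS, and pointing spark.yarn.archive at it in spark-defaults.conf. A minimal sketch of that flow (the /apps/spark path and archive name here are assumptions, not the exact values from the document):

$ cd /opt/mapr/spark/spark-2.0.1/jars
$ zip /tmp/spark-jars.zip *.jar          # bundle the Spark runtime JARs at the archive root
$ hadoop fs -mkdir -p /apps/spark
$ hadoop fs -put /tmp/spark-jars.zip /apps/spark/

And then in /opt/mapr/spark/spark-2.0.1/conf/spark-defaults.conf:

spark.yarn.archive maprfs:///apps/spark/spark-jars.zip

If that property is missing, or the path it names does not exist, YARN starts the ApplicationMaster container without the Spark classes on its classpath, which matches the ExecutorLauncher error above.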

Please let us know if I am missing anything here, Rachel Silver.

Below is the RPM information:

mapr-mapreduce2-2.7.0.39122.GA-1.x86_64
mapr-spark-2.0.1.201612011057-1.noarch
mapr-resourcemanager-2.7.0.39122.GA-1.x86_64
mapr-hadoop-core-2.7.0.39122.GA-1.x86_64
mapr-mapreduce1-0.20.2.39122.GA-1.x86_64
mapr-fileserver-5.2.0.39122.GA-1.x86_64
mapr-drill-1.9.0.201612011635-1.noarch
mapr-pig-0.16.201612051350-1.noarch
mapr-zk-internal-5.2.0.39122.GA-1.x86_64
mapr-historyserver-2.7.0.39122.GA-1.x86_64
mapr-nodemanager-2.7.0.39122.GA-1.x86_64
mapr-core-internal-5.2.0.39122.GA-1.x86_64
mapr-core-5.2.0.39122.GA-1.x86_64
mapr-hive-1.2.201611292220-1.noarch
mapr-zookeeper-5.2.0.39122.GA-1.x86_64

Is anything missing with the spark-assembly JAR?
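
One thing worth noting: Spark 2.x no longer ships a monolithic spark-assembly JAR at all; the runtime is split across the individual JARs in /opt/mapr/spark/spark-2.0.1/jars, and spark.yarn.jars or spark.yarn.archive is what makes them visible to YARN. As an alternative sketch (assuming every node has the same Spark install path), spark.yarn.jars can reference the local jars directory directly instead of an archive:

spark.yarn.jars local:/opt/mapr/spark/spark-2.0.1/jars/*

If neither property is set, stock Spark 2.x falls back to zipping $SPARK_HOME/jars and uploading it on each submit, so a property that is set but pointing at a bad path seems a more likely culprit than a missing assembly JAR.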
Thanks,

Bharath