
Error after upgrading Spark to 2.0.1

Question asked by sagar.sonawane on Apr 6, 2017
Latest reply on Apr 12, 2017 by aalvarez

I followed the Post-Upgrade Steps for Spark after upgrading Spark to 2.0.1 without the installer. One step mentioned putting all the jars into a zip on maprfs, which I did. Since then, I have been getting hundreds of errors like the one below, continuously:
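For reference, the steps I followed were roughly the following. This is a sketch, not the exact commands: the local Spark install path is from my setup and may differ on yours, while the maprfs target matches the path shown in the error.

```
# Sketch of the documented post-upgrade steps (local path is an assumption)
cd /opt/mapr/spark/spark-2.0.1
zip -r spark-jars.zip jars/
hadoop fs -put spark-jars.zip maprfs:///apps/spark/spark-jars.zip
```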

 

Exception encountered when attempting to load application log maprfs:///apps/spark/spark-jars.zip
java.lang.IllegalArgumentException: Codec [zip] is not available. Consider setting spark.io.compression.codec=snappy
    at org.apache.spark.io.CompressionCodec$$anonfun$createCodec$1.apply(CompressionCodec.scala:78)
    at org.apache.spark.io.CompressionCodec$$anonfun$createCodec$1.apply(CompressionCodec.scala:78)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.io.CompressionCodec$.createCodec(CompressionCodec.scala:78)
    at org.apache.spark.scheduler.EventLoggingListener$$anonfun$6$$anonfun$apply$1.apply(EventLoggingListener.scala:317)
    at org.apache.spark.scheduler.EventLoggingListener$$anonfun$6$$anonfun$apply$1.apply(EventLoggingListener.scala:317)
    at scala.collection.mutable.MapLike$class.getOrElseUpdate(MapLike.scala:194)
    at scala.collection.mutable.AbstractMap.getOrElseUpdate(Map.scala:80)
    at org.apache.spark.scheduler.EventLoggingListener$$anonfun$6.apply(EventLoggingListener.scala:317)
    at org.apache.spark.scheduler.EventLoggingListener$$anonfun$6.apply(EventLoggingListener.scala:316)
    at scala.Option.map(Option.scala:146)
    at org.apache.spark.scheduler.EventLoggingListener$.openEventLog(EventLoggingListener.scala:316)
    at org.apache.spark.deploy.history.FsHistoryProvider.org$apache$spark$deploy$history$FsHistoryProvider$$replay(FsHistoryProvider.scala:563)
    at org.apache.spark.deploy.history.FsHistoryProvider.org$apache$spark$deploy$history$FsHistoryProvider$$mergeApplicationListing(FsHistoryProvider.scala:398)
    at org.apache.spark.deploy.history.FsHistoryProvider$$anonfun$checkForLogs$3$$anon$4.run(FsHistoryProvider.scala:310)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)

 

I captured these errors with MapR Monitoring in the Kibana console. I added a "spark.io.compression.codec     snaapy" entry to spark-defaults.conf, but I am still getting the errors above.
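For completeness, the relevant part of my spark-defaults.conf now looks roughly like this. The spark.yarn.archive line is my understanding of how the documented steps point Spark at the zip (so it is an assumption); the codec line is copied as I entered it:

```
# spark-defaults.conf (excerpt; spark.yarn.archive value assumed from the
# documented post-upgrade steps, codec line copied as entered)
spark.yarn.archive           maprfs:///apps/spark/spark-jars.zip
spark.io.compression.codec   snaapy
```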

 

Any help would be appreciated.
