
ssl_keystore permission denied for Spark on YARN

Question asked by andylerner (Employee) on Jan 9, 2018
Latest reply on Jan 9, 2018 by maprcommunity

I've got a single-node MapR 6.0 cluster with MEP 4.0 that was installed by the MapR installer. When I run SparkPi using spark-submit:

 

SPARK_EXAMPLES_JAR=/opt/mapr/spark/spark-2.1.0/examples/jars/spark-examples_2.11-2.1.0-mapr-1710.jar

SPARK_HOME=/opt/mapr/spark/spark-2.1.0

$SPARK_HOME/bin/spark-submit --class org.apache.spark.examples.SparkPi --master yarn --deploy-mode cluster --num-executors 2 --driver-memory 512m --executor-memory 512m --executor-cores 2 --queue default $SPARK_EXAMPLES_JAR 10

 

the job fails, and in the YARN userlogs I see this in the stderr file for an application attempt:

 

spark.ssl.keyStore=/opt/mapr/conf/ssl_keystore

spark.ssl.keyStorePassword=mapr123

...

2018-01-09 23:10:19,531 WARN  [Driver] component.AbstractLifeCycle: FAILED SslContextFactory@6e7d700e(/opt/mapr/conf/ssl_keystore,/opt/mapr/conf/ssl_keystore): java.io.FileNotFoundException: /opt/mapr/conf/ssl_keystore (Permission denied)

java.io.FileNotFoundException: /opt/mapr/conf/ssl_keystore (Permission denied)

 

I confirmed with keytool -list that I can access the ssl_keystore with the mapr123 password.
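For reference, the check I ran (keystore path and password are the ones shown in the log above):

```shell
# List the entries in the MapR SSL keystore to confirm the file and
# password are usable; keytool ships with the JDK.
keytool -list -keystore /opt/mapr/conf/ssl_keystore -storepass mapr123
```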

 

If I change the permissions on ssl_keystore from 400 to 444, the job succeeds. Is there a configuration change needed to run Spark jobs on YARN for Spark 2.1.0 (MEP 4.0)?
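Concretely, the permission change that made the job succeed was just this (path is the MapR default):

```shell
# Widen the keystore from owner-read-only (400) to world-readable (444)
# so the user the YARN container runs as can open it.
chmod 444 /opt/mapr/conf/ssl_keystore
ls -l /opt/mapr/conf/ssl_keystore
```

I'd rather not leave the keystore world-readable, hence the question about a supported configuration change.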

 

 
