
Steps to override log4j.properties in Spark

Question asked by Vinayak Meghraj on May 20, 2016
Latest reply on May 20, 2016 by maprcommunity


 

1) Create a directory under the user's home path and copy the following files from /opt/mapr/spark/spark-1.6.1/conf to the newly created directory:

   log4j.properties

   spark-defaults.conf

   spark-env.sh

Note: In this example the files were copied to /home/mapr.
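
For reference, the copy can be done with commands along these lines (this sketch assumes the files go directly into /home/mapr, as in this example):

   cp /opt/mapr/spark/spark-1.6.1/conf/log4j.properties /home/mapr/
   cp /opt/mapr/spark/spark-1.6.1/conf/spark-defaults.conf /home/mapr/
   cp /opt/mapr/spark/spark-1.6.1/conf/spark-env.sh /home/mapr/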

 

2) Add the following entries to the user's ~/.bash_profile:

   export SPARK_HOME=/opt/mapr/spark/spark-1.6.1

   export SPARK_CONF_DIR=/home/mapr/

 

3) Run "source ~/.bash_profile" to load the new settings into the current shell.
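
To confirm the variables are set in the current shell, a quick check such as the following can be used:

   echo $SPARK_HOME       # expected: /opt/mapr/spark/spark-1.6.1
   echo $SPARK_CONF_DIR   # expected: /home/mapr/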

 

4) Test the changes by running the SparkPi example with different log levels, updating the log4j.rootCategory property (e.g. log4j.rootCategory=WARN) in log4j.properties; a sample file is sketched after the command below.

/opt/mapr/spark/spark-1.6.1/bin/spark-submit --class org.apache.spark.examples.SparkPi --master yarn --deploy-mode cluster --driver-memory 4g --executor-memory 2g --executor-cores 1 --queue thequeue /opt/mapr/spark/spark-1.6.1/lib/spark-examples*.jar 100
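
For reference, a minimal log4j.properties that logs to the console at WARN level could look like the sketch below; the appender settings here are only an example and may differ from the file shipped with your Spark version:

   # Log everything to the console at WARN level
   log4j.rootCategory=WARN, console
   log4j.appender.console=org.apache.log4j.ConsoleAppender
   log4j.appender.console.target=System.err
   log4j.appender.console.layout=org.apache.log4j.PatternLayout
   log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n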
