
install and run two different Spark versions on MapR 6

Question asked by rbukarev on Feb 27, 2018
Latest reply on Mar 1, 2018 by MichaelSegel

I'm running MapR 6, which ships Spark 2.1.0 out of the box. Another component of my bigger ecosystem (Spark Controller for SAP BW on HANA) is only compatible with Spark 1.6.x. I think the main thing it requires is spark-assembly*.jar, which was removed in Spark 2.x. As I see it, I have two options.

1. Install Spark 1.6.x on the cluster.

2. Build an "assembly" jar from the libraries in the jars directory. To be honest, I'm not sure this will work: for example, Spark Controller requires akka classes, and I don't see a corresponding jar in the jars directory.


So, regarding option 1, installing Spark 1.6.x: do I understand it right that I can't just use a binary from the Apache Spark downloads? And if I can't, how do I add the mapr-spark-1.6 RPMs to the repository, and am I allowed to?
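For context, what I mean by "add to the repository" is something like the sketch below: register a MEP yum repo that carries mapr-spark 1.6 and install from it. The MEP version and URL are placeholders, not verified, and whether such a repo even exists for a MapR 6 compatible MEP is exactly what I'm asking.

```shell
# Hypothetical repo entry; <version> is a placeholder I don't know the value of
cat > /etc/yum.repos.d/mapr_spark16.repo <<'EOF'
[MapR_Spark16]
name=MapR ecosystem repo (assumed to carry Spark 1.6)
baseurl=http://package.mapr.com/releases/MEP/MEP-<version>/redhat
gpgcheck=0
enabled=1
EOF

# Then pin the install to the 1.6 package, if one is published at all
yum install mapr-spark-1.6\*
```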