As I get ready for the MCSD 1.6 exam this week, my concern about versioning differences between Spark releases keeps growing. Here is what I mean.
Spark development moves quickly. I am wondering whether the currently active Spark Developer 1.6 exam from MapR expects answers that match Spark 1.6, or answers that correspond to v2.0.2, the latest release as of this writing.
Here is an example of what I mean:
While this test covers 1.6, and the material on MapR Academy for it states that SQLContext is the starting point for all Spark SQL functionality, the official Spark documentation for the latest version, 2.0.2, has moved on to SparkSession.
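To make the difference concrete, here is a minimal sketch of the two entry points side by side (the app name, master URL, and `people.json` path are placeholders, not from the exam material):

```scala
// Spark 1.6: SQLContext, built on top of a SparkContext,
// is the entry point for Spark SQL.
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

val conf = new SparkConf().setAppName("example").setMaster("local[*]")
val sc = new SparkContext(conf)
val sqlContext = new SQLContext(sc)
val df16 = sqlContext.read.json("people.json")

// Spark 2.0+: SparkSession subsumes SQLContext (and HiveContext)
// and is created via a builder.
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("example")
  .master("local[*]")
  .getOrCreate()
val df20 = spark.read.json("people.json")
```

Code written against the 1.6 SQLContext API still runs on 2.x, but the documentation and examples now lead with SparkSession, which is exactly what makes studying from the wrong version's docs confusing.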
Of course, there are many other changes like this. For someone taking this exam in the next few days, should they be studying the Spark 1.6.0 documentation or the latest Spark 2.0.2 documentation?
I hope you understand my concern. Thank you for your attention.