Could someone please help me out here?
I have a Hive query that completes in 1 hr 45 min on Hive 1.2. I'm trying to run the same query on Spark with
driver memory set to 14g,
but I get the error below.
I have also tried running it with spark.yarn.executor.memoryOverhead set to 1600, but it fails with the same error.
The YARN container size is 16g, so I'm not sure why it's failing at 9g.
I would appreciate any help.
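For reference, this is roughly the spark-submit invocation I'm using (the script name, master, and queue are placeholders; the memory flags are the settings described above, and note that spark.yarn.executor.memoryOverhead takes a value in MB):

```shell
# Placeholder invocation -- script name and cluster settings are illustrative;
# the memory flags match what is described above.
spark-submit \
  --master yarn \
  --driver-memory 14g \
  --conf spark.yarn.executor.memoryOverhead=1600 \
  my_query_job.py
```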