
Java Heap Space and Hive Jobs

Question asked by mandoskippy on Aug 27, 2012
Latest reply on Aug 30, 2012 by gera
While I am still getting to know Java, Hadoop, and Hive, one thing that had always plagued me on my previous Hadoop distro was Java heap space errors. I do have some small nodes in a test cluster (2-processor VMs with 6 GB of RAM and one physical drive dedicated to MapR), and when I switched to MapR I noticed that Java heap space issues were largely a thing of the past. I was ecstatic. I don't know whether MapR does something differently, but it was heaven, and so I never did more research into the subject.

Fast forward to today: I am running some large Hive ETL jobs, and I am starting to see that Java heap space error again. I do have some larger rows in my data, but I was wondering if there is a good way to troubleshoot this with MapR. What settings should I look at tuning, where are they located, and what is the best practice for changing them across my cluster?
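For anyone landing here with the same error: a common starting point on classic MRv1-era Hadoop (which MapR used at the time) is raising the child-task JVM heap. The `-Xmx` value below is only an example; the right number depends on how much RAM each node can spare, which matters on small 6 GB VMs like these. Per-session, from the Hive CLI:

```sql
-- Raise the map/reduce child task heap for queries in this Hive session only
SET mapred.child.java.opts=-Xmx1024m;
-- io.sort.mb (the map-side sort buffer) must fit inside that heap;
-- lowering it can also relieve heap pressure during large ETL jobs
SET io.sort.mb=128;
```

The same property can be set cluster-wide in `mapred-site.xml` on the nodes (restart of the TaskTrackers required for it to take effect):

```xml
<!-- mapred-site.xml: default child task JVM options; -Xmx1024m is an example value -->
<property>
  <name>mapred.child.java.opts</name>
  <value>-Xmx1024m</value>
</property>
```

Session-level `SET` is usually the safer first experiment, since it affects only your own queries rather than every job on the cluster.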