
Hive Package upgrade / HIVE_KRYO_BUFFER_SIZE does not exist

Question asked by dodoman on Sep 16, 2016
Latest reply on Jan 30, 2017 by dodoman

Hello there,


We installed a MapR 5.1 cluster at a customer's site in May 2016.


After some months they ran into a buffer underflow error for the first time.

I searched for patches and found that upgrading Hive would make it possible to increase the buffer size:


Hive 1.2.1-1605 Release Notes

The problem is that the cluster version from May 2016 is not compatible with the newest Hive version (201608 build).

The customer will not take the risk of upgrading the whole cluster to newer packages (RHEL) after we ran into this issue upgrading Hive to a newer build of the same version.
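For completeness, the newer Hive build exposes the Kryo buffer size as a configuration property. The property name and value below are assumptions inferred from the HIVE_KRYO_BUFFER_SIZE constant name; the Hive 1.2.1-1605 release notes should be checked for the exact key. It would go into hive-site.xml:

```xml
<!-- Hypothetical: property name inferred from the HIVE_KRYO_BUFFER_SIZE
     constant; verify the exact key and unit against the Hive 1.2.1-1605
     release notes before using this on a cluster. -->
<property>
  <name>hive.kryo.buffer.size</name>
  <!-- example value only -->
  <value>4096</value>
</property>
```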


Here is the error we are getting:


Task ID:

Diagnostic Messages for this Task:
Error: java.lang.RuntimeException: java.lang.reflect.InvocationTargetException
        at org.apache.hadoop.util.ReflectionUtils.newInstance(
        at org.apache.hadoop.mapred.ReduceTask.runOldReducer(
        at org.apache.hadoop.mapred.YarnChild$
        at Method)
        at org.apache.hadoop.mapred.YarnChild.main(
Caused by: java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(
        at java.lang.reflect.Constructor.newInstance(
        at org.apache.hadoop.util.ReflectionUtils.newInstance(
        ... 7 more
Caused by: java.lang.NoSuchFieldError: HIVE_KRYO_BUFFER_SIZE
        at org.apache.hadoop.hive.ql.exec.Utilities.<clinit>(
        ... 12 more



Googling for HIVE_KRYO_BUFFER_SIZE turns up nothing about this field. Where does it come from?
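One note that may help readers hitting the same trace: a NoSuchFieldError during class initialization usually means the failing class (here Utilities) was compiled against a newer version of another class that defines the field, while an older jar without that field is still first on the task classpath, i.e. mixed old and new Hive jars. Presumably the constant lives in HiveConf.ConfVars, though that is an assumption. A minimal, self-contained Java sketch of the lookup the JVM is failing (it probes a JDK class for demonstration; on the cluster you would probe org.apache.hadoop.hive.conf.HiveConf$ConfVars for HIVE_KRYO_BUFFER_SIZE):

```java
// Sketch: check via reflection whether a class visible on the classpath
// actually defines a given public field. This is the same resolution step
// that fails with NoSuchFieldError when Utilities references
// HIVE_KRYO_BUFFER_SIZE but an older HiveConf jar is loaded first.
public class FieldProbe {
    static boolean hasField(String className, String fieldName) {
        try {
            Class.forName(className).getField(fieldName);
            return true;
        } catch (ClassNotFoundException | NoSuchFieldException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // Demo against a JDK class; on the cluster substitute
        // "org.apache.hadoop.hive.conf.HiveConf$ConfVars" and
        // "HIVE_KRYO_BUFFER_SIZE".
        System.out.println(hasField("java.lang.Integer", "MAX_VALUE"));
        System.out.println(hasField("java.lang.Integer", "NO_SUCH_FIELD"));
    }
}
```

If the probe reports false on a node where the upgrade was applied, an old hive jar is still shadowing the new one there.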



Additional Setup information:


MapR 5.1

Version of May 2016

Hive 1.2, upgraded to the Hive 1.2 build of Aug 2016


Without the upgrade we run into the buffer underflow.

With the upgrade we get the error above.

The error occurs from inside Hue, from hive -e "query", and from JDBC queries.