
Queries fail after Hive 0.11 to Hive 0.13 upgrade

Question asked by packetboy2000 on Apr 2, 2015
Latest reply on Apr 2, 2015 by Hao Zhu
Before the upgrade, all of the queries below worked fine... not so post-upgrade:

I *CAN* run this:

select ip,source,day
from hive_flow.pcaps_ip
where day between '2015-02-01' and '2015-02-15' and ip in ('1.2.3.4')

*and*

select ip,source,day
from hive_flow.pcaps_ip
where day between '2015-02-16' and '2015-02-28' and ip in ('1.2.3.4')

In other words, I can use a BETWEEN range of up to about two weeks... however, as soon as I expand that range, say to a full month:


select ip,source,day
from hive_flow.pcaps_ip
where day between '2015-02-01' and '2015-02-28' and ip in ('1.2.3.4')

Then it fails:


    2015-03-31 15:52:02,611 INFO org.apache.hadoop.hive.ql.exec.MapOperator: DESERIALIZE_ERRORS:0
    2015-03-31 15:52:02,611 INFO org.apache.hadoop.hive.ql.exec.TableScanOperator: 0 finished. closing...
    2015-03-31 15:52:02,611 INFO org.apache.hadoop.hive.ql.exec.FilterOperator: 1 finished. closing...
    2015-03-31 15:52:02,611 INFO org.apache.hadoop.hive.ql.exec.FilterOperator: PASSED:0
    2015-03-31 15:52:02,611 INFO org.apache.hadoop.hive.ql.exec.FilterOperator: FILTERED:28496558
    2015-03-31 15:52:02,611 INFO org.apache.hadoop.hive.ql.exec.SelectOperator: 2 finished. closing...
    2015-03-31 15:52:02,611 INFO org.apache.hadoop.hive.ql.exec.FileSinkOperator: 3 finished. closing...
    2015-03-31 15:52:02,611 INFO org.apache.hadoop.hive.ql.exec.FileSinkOperator: Final Path: FS maprfs:/user/baldwinl/tmp/hive/hive_2015-03-31_15-50-44_684_6049043808217844963-92/_tmp.-ext-10001/000000_0
    2015-03-31 15:52:02,612 INFO org.apache.hadoop.hive.ql.exec.FileSinkOperator: Writing to temp file: FS maprfs:/user/baldwinl/tmp/hive/hive_2015-03-31_15-50-44_684_6049043808217844963-92/_task_tmp.-ext-10001/_tmp.000000_0
    2015-03-31 15:52:02,612 INFO org.apache.hadoop.hive.ql.exec.FileSinkOperator: New Final Path: FS maprfs:/user/baldwinl/tmp/hive/hive_2015-03-31_15-50-44_684_6049043808217844963-92/_tmp.-ext-10001/000000_0
    2015-03-31 15:52:02,616 INFO org.apache.hadoop.hive.ql.exec.FileSinkOperator: 3 Close done
    2015-03-31 15:52:02,616 INFO org.apache.hadoop.hive.ql.exec.SelectOperator: 2 Close done
    2015-03-31 15:52:02,616 INFO org.apache.hadoop.hive.ql.exec.FilterOperator: 1 Close done
    2015-03-31 15:52:02,616 INFO org.apache.hadoop.hive.ql.exec.TableScanOperator: 0 Close done
    2015-03-31 15:52:02,616 INFO org.apache.hadoop.hive.ql.exec.MapOperator: 4 Close done
    2015-03-31 15:52:02,616 INFO org.apache.hadoop.hive.ql.exec.mr.ExecMapper: ExecMapper: processed 28496558 rows: used memory = 207206312
    2015-03-31 15:52:02,633 INFO org.apache.hadoop.mapred.TaskLogsTruncater: Initializing logs' truncater with mapRetainSize=-1 and reduceRetainSize=-1
    2015-03-31 15:52:02,648 WARN org.apache.hadoop.mapred.Child: Error running child
    java.io.IOException: java.io.IOException: java.lang.ClassCastException: org.apache.hadoop.io.IntWritable cannot be cast to org.apache.hadoop.io.LongWritable
     at org.apache.hadoop.hive.io.HiveIOExceptionHandlerChain.handleRecordReaderNextException(HiveIOExceptionHandlerChain.java:121)
     at org.apache.hadoop.hive.io.HiveIOExceptionHandlerUtil.handleRecordReaderNextException(HiveIOExceptionHandlerUtil.java:77)
     at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.doNextWithExceptionHandler(HadoopShimsSecure.java:263)
     at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.next(HadoopShimsSecure.java:178)
     at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.moveToNext(MapTask.java:233)
     at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.next(MapTask.java:218)
     at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:48)
     at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:444)
     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:354)
     at org.apache.hadoop.mapred.Child$4.run(Child.java:278)
     at java.security.AccessController.doPrivileged(Native Method)
     at javax.security.auth.Subject.doAs(Subject.java:415)
     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1566)
     at org.apache.hadoop.mapred.Child.main(Child.java:267)
    Caused by: java.io.IOException: java.lang.ClassCastException: org.apache.hadoop.io.IntWritable cannot be cast to org.apache.hadoop.io.LongWritable
     at org.apache.hadoop.hive.io.HiveIOExceptionHandlerChain.handleRecordReaderNextException(HiveIOExceptionHandlerChain.java:121)
     at org.apache.hadoop.hive.io.HiveIOExceptionHandlerUtil.handleRecordReaderNextException(HiveIOExceptionHandlerUtil.java:77)
     at org.apache.hadoop.hive.ql.io.HiveContextAwareRecordReader.doNext(HiveContextAwareRecordReader.java:356)
     at org.apache.hadoop.hive.ql.io.CombineHiveRecordReader.doNext(CombineHiveRecordReader.java:101)
     at org.apache.hadoop.hive.ql.io.CombineHiveRecordReader.doNext(CombineHiveRecordReader.java:41)
     at org.apache.hadoop.hive.ql.io.HiveContextAwareRecordReader.next(HiveContextAwareRecordReader.java:123)
     at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.doNextWithExceptionHandler(HadoopShimsSecure.java:261)
     ... 11 more
    Caused by: java.lang.ClassCastException: org.apache.hadoop.io.IntWritable cannot be cast to org.apache.hadoop.io.LongWritable
     at org.apache.hadoop.hive.ql.io.orc.RecordReaderImpl$LongTreeReader.next(RecordReaderImpl.java:717)
     at org.apache.hadoop.hive.ql.io.orc.RecordReaderImpl$StructTreeReader.next(RecordReaderImpl.java:1788)
     at org.apache.hadoop.hive.ql.io.orc.RecordReaderImpl.next(RecordReaderImpl.java:2997)
     at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat$OrcRecordReader.next(OrcInputFormat.java:153)
     at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat$OrcRecordReader.next(OrcInputFormat.java:127)
     at org.apache.hadoop.hive.ql.io.HiveContextAwareRecordReader.doNext(HiveContextAwareRecordReader.java:351)
     ... 15 more
    2015-03-31 15:52:02,657 INFO org.apache.hadoop.mapred.Task: Runnning cleanup for the task
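
If I'm reading the stack trace right, ORC's LongTreeReader (a column the current schema declares as bigint) is being handed IntWritable values (files written with that column as int), and the failure path runs through CombineFileRecordReader. My guess, and it's just a guess, is that some partitions still hold ORC files written under an older int schema, and the combined splits only start mixing old and new files once the date range gets wide enough. Assuming the mismatch shows up in the metastore's per-partition schemas, this would be one way to check (partition values are illustrative):

-- Compare the table-level schema against partition-level schemas;
-- if some numeric column is bigint on the table but int on older
-- partitions, that would explain the IntWritable/LongWritable cast error.
describe hive_flow.pcaps_ip;
describe hive_flow.pcaps_ip partition (day='2015-02-01');
describe hive_flow.pcaps_ip partition (day='2015-02-28');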


Thoughts?
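
One test I'm planning in the meantime, on the theory that the combine input format is what's mixing old and new ORC files into a single split: force one reader per file and re-run the failing query.

-- Disable split combining for this session so each file gets its own
-- reader; if the month-wide query then succeeds, the combined splits
-- were mixing ORC files with different column types.
set hive.input.format=org.apache.hadoop.hive.ql.io.HiveInputFormat;

select ip,source,day
from hive_flow.pcaps_ip
where day between '2015-02-01' and '2015-02-28' and ip in ('1.2.3.4');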
