
MapR DB Bulk Load - File not found exception

Question asked by shashi.vish on Jun 18, 2015

We have a bulk-load job for loading data into MapR-DB. We originally used the same code to load data into HBase, where it worked perfectly. Since the HBase bulk-load APIs are compatible with MapR-DB, we ran the same code against MapR-DB, changing only the table name (we follow a different convention for creating MapR-DB tables). After the job runs, the reducer reaches 100%, but then the tasks start failing.
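For context, the driver setup is roughly the standard HBase incremental bulk-load pattern. This is a minimal sketch, not our exact code: the class name `BulkLoadDriver` and the input/output paths are placeholders, and the only MapR-DB-specific change is passing a path-style table name instead of a plain HBase table name.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.KeyValue;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class BulkLoadDriver {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        Job job = Job.getInstance(conf, "mapr-db-bulk-load");
        job.setJarByClass(BulkLoadDriver.class);
        job.setMapOutputKeyClass(ImmutableBytesWritable.class);
        job.setMapOutputValueClass(KeyValue.class);

        // For MapR-DB, the "table name" is a filesystem path, e.g.
        // "/user/hbase/mydatatable"; for plain HBase it would be a
        // simple table name like "mydatatable".
        HTable table = new HTable(conf, "/user/hbase/mydatatable");

        // Wires up the reducer, total-order partitioner, and
        // HFileOutputFormat2 based on the table's region boundaries.
        HFileOutputFormat2.configureIncrementalLoad(job, table);

        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```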

15/06/18 00:43:56 INFO mapreduce.Job:  map 100% reduce 95%
15/06/18 00:45:23 INFO mapreduce.Job:  map 100% reduce 96%
15/06/18 00:46:23 INFO mapreduce.Job:  map 100% reduce 97%
15/06/18 00:46:38 INFO mapreduce.Job:  map 100% reduce 98%
15/06/18 01:12:59 INFO mapreduce.Job:  map 100% reduce 99%
15/06/18 02:09:56 INFO mapreduce.Job:  map 100% reduce 100%

15/06/18 02:38:51 INFO mapreduce.Job: Task Id : attempt_1434393685485_0021_r_000000_0, Status : FAILED
Error: maprfs:/user/hbase/mydatatable/_temporary/1/_temporary/attempt_1434393685485_0021_r_000000_0
        at com.mapr.fs.MapRFileSystem.listMapRStatus(
        at com.mapr.fs.MapRFileSystem.listStatus(
        at com.mapr.fs.MapRFileSystem.listStatus(
        at com.thinkbiganalytics.clickstream.hbase.Writer.close(
        at com.thinkbiganalytics.clickstream.hbase.MultiHFileRecordWriter.close(
        at org.apache.hadoop.mapred.ReduceTask$NewTrackingRecordWriter.close(
        at org.apache.hadoop.mapred.ReduceTask.runNewReducer(
        at org.apache.hadoop.mapred.YarnChild$
        at Method)
        at org.apache.hadoop.mapred.YarnChild.main(

In the stack trace above, it is looking for the location '/user/hbase/mydatatable', which is my table name, not a directory.

Has anyone faced a similar issue? How should I proceed here?