
Running MapReduce from a jar file: Out of space

Question asked by fmdataservices on Aug 8, 2012
Latest reply on Aug 8, 2012 by gera
I am getting this error intermittently while running a MapReduce job from a jar file. Both the cluster as a whole and the client machine appear to have plenty of space available.

Version:
1.2.3.12961.GA and Hive 0.7.1

I am running from a client machine with some additional parameters:

    -Dmapred.reduce.tasks=30
    -D in=/user/hive/warehouse/job_tz_hourly_2012080812_2012080817.db/userviewsbytimerangekeywords
    -D out=/user/datasrv/jobruns/tz-hourly/job-tz-hourly_2012080812_2012080817/mr-out-kw
    -D cache=/user/datasrv/jobruns/tz-hourly/job-tz-hourly_2012080812_2012080817/data-in/domain_regex/domain_regex.txt
    -D name=kw
    -files /opt/datasrv/qa/pageview_hadoop_aws/pageview-build/sharedobjects/ubuntu/libSemTech.so

----------

    Exception in thread "main" java.io.IOException: No space left on device
        at java.io.FileOutputStream.writeBytes(Native Method)
        at java.io.FileOutputStream.write(FileOutputStream.java:282)
        at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:51)
        at org.apache.hadoop.util.RunJar.unJar(RunJar.java:79)
        at org.apache.hadoop.util.RunJar.unJar(RunJar.java:53)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:162)
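The failure is coming out of RunJar.unJar, which, as far as I can tell, extracts the jar into a local temporary directory on the client (hadoop.tmp.dir or java.io.tmpdir) rather than into the DFS. Below is a minimal sketch of how I checked usable space in the likely local directories; the hadoop.tmp.dir path shown is an assumed default, not read from my actual configuration.

    import java.io.File;

    /** Quick check of usable space in the local directories the client-side unjar might write to. */
    public class TmpSpaceCheck {
        public static void main(String[] args) {
            // Candidate locations; the second entry is an assumed hadoop.tmp.dir default, adjust to your config.
            String[] candidates = {
                System.getProperty("java.io.tmpdir"),                  // usually /tmp
                "/tmp/hadoop-" + System.getProperty("user.name")       // common hadoop.tmp.dir default
            };
            for (String path : candidates) {
                File dir = new File(path);
                System.out.printf("%-40s usable=%,d MB of %,d MB%n",
                        path,
                        dir.getUsableSpace() / (1024 * 1024),
                        dir.getTotalSpace() / (1024 * 1024));
            }
        }
    }

Both locations report plenty of free space whenever I run this, which is why the intermittent "No space left on device" is puzzling.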
