
How to delete a large number of files generated by Pig?

Question asked by shaka on Oct 25, 2013
Latest reply on Oct 31, 2013 by shaka
One of our users is splitting a Hive table using Pig, but the Pig job generates a huge number of files under the _temporary folder.

There are so many files that "hadoop fs -rmr" fails. Over the NFS mount I also tried "find . -type f -delete" and a plain "rm -rf _temporary", but neither has gotten anywhere so far. Has anyone run into this before?
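One way to make the NFS-mount approach tractable is to delete the files in bounded batches rather than in a single pass, so no one command has to hold the whole listing. The sketch below demonstrates the pattern on a throwaway local directory; on a real cluster you would cd into the _temporary folder under your /mapr mount instead (the paths and batch size here are illustrative assumptions, not a confirmed fix).

```shell
#!/bin/sh
# Sketch: batch-delete a huge file tree instead of one giant rm/rmr.
# Demonstrated on a temporary local directory; substitute the real
# NFS-mounted _temporary path on an actual cluster.
set -e

tmp=$(mktemp -d)                       # stand-in for the job output dir
mkdir -p "$tmp/_temporary"

# Create dummy files (a real Pig job may leave millions behind).
for i in $(seq 1 500); do : > "$tmp/_temporary/part-$i"; done

# Delete files in batches of 100: -print0/-0 handles odd filenames,
# -n 100 keeps each rm invocation's argument list small.
find "$tmp/_temporary" -type f -print0 | xargs -0 -n 100 rm -f

# With the files gone, removing the directory skeleton is cheap.
rm -rf "$tmp/_temporary"

[ ! -d "$tmp/_temporary" ] && echo "deleted"
rm -rf "$tmp"
```

The same batching idea applies on the Hadoop side, e.g. deleting subdirectories of _temporary one at a time instead of the whole tree at once.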
