
How can I purge or merge millions of files in HDFS?

Question asked by anikad_ayman on Mar 2, 2018
Latest reply on Mar 24, 2018 by MichaelSegel

In our data lake (Hadoop/MapR/Red Hat) we have a directory that contains more than 40 million files. We can't even run an ls command on it.


I've tried the Hadoop command "hadoop fs -getmerge" to merge the files, but it times out.


"hadoop fs -rm" doesn't work either.
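For reference, these are roughly the commands I ran (the directory path below is a placeholder for our real one). Both hang or time out on the 40M-file directory:

```shell
# Attempt 1: merge all files in the directory into one local file.
# Times out before completing on a directory this large.
hadoop fs -getmerge /path/to/big_dir /tmp/merged_output

# Attempt 2: delete the files directly.
# Also fails to complete against 40M+ entries.
hadoop fs -rm /path/to/big_dir/*
```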


Is there another way to view the contents of this folder? How can I purge old files from it without a full scan?


Thank you