In our data lake (Hadoop/MapR/Red Hat) we have a directory that contains more than 40 million files. We can't run an ls command on it.
I've tried the Hadoop command "hadoop fs -getmerge" to merge the files, but it times out.
"hadoop fs -rm" doesn't work either.
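For reference, this is roughly what was attempted (the directory path below is a hypothetical placeholder, since the real path isn't shown here; both commands stall because the client has to enumerate all ~40M entries first):

```shell
# Hypothetical placeholder for the real directory path.
DIR=/data/huge-dir

# Attempted: merge all files into one local file -- times out on ~40M files.
hadoop fs -getmerge "$DIR" /tmp/merged.out

# Attempted: delete files -- also fails to complete, presumably because
# the namespace listing of the directory is itself too large.
hadoop fs -rm "$DIR"/*
```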
Is there another way to view the content of this folder? And how could I purge old files from it without a full scan?