
volume locked by gfsck

Question asked by teitou on Aug 10, 2016
Latest reply on Aug 12, 2016 by mufeed

We are currently using MapR M7 on Amazon Web Services. We executed gfsck to check one of our Hadoop volumes:

 

/opt/mapr/bin/gfsck rwvolume=hadoop_home

 

but we cancelled the job partway through with kill (a rough sketch of the sequence is below). Since then, the Hadoop volume has remained locked and unusable; for example, executing a hadoop command emits the error message shown after the sketch.
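A minimal reconstruction of what we did (the PID is illustrative, not the actual one):

/opt/mapr/bin/gfsck rwvolume=hadoop_home &
# the check was still in progress when we interrupted it
kill 12345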


hadoop@ip-10-134-181-109:~$ hadoop fs -ls /user/
2016-08-10 08:08:23,2061 ERROR Client fs/client/fileclient/cc/client.cc:2734 Thread: 9119 Failed to fetch attributes for volume hadoop_home, error 11
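If the code is a standard Linux errno, 11 is EAGAIN ("Resource temporarily unavailable"), which would fit a volume that is still held by the gfsck lock rather than one that is gone. A basic check of the volume's state from a cluster node (a sketch, assuming maprcli is on the PATH; this is not from our original session) would be:

maprcli volume info -name hadoop_home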

 

---

 

 

hbase(main):001:0> list
TABLE
ERROR: java.io.IOException: doListTables() called for default path(/user/hadoop), but it does not exists.

 

Here is some help for this command:
List all tables in hbase. Optional regular expression parameter could
be used to filter the output. Examples:
  (snip)

 

We tried restarting the system, but that didn't fix the issue.
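To make "restart" concrete, on each node it was roughly the following (a sketch assuming the standard MapR init scripts; the exact commands on our instances may have differed), plus a reboot of the instances themselves:

# restart ZooKeeper first, then Warden, on each node
sudo service mapr-zookeeper restart
sudo service mapr-warden restart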

 

Could you please advise what should be done to unlock the volume?

 

Many thanks.

 
