
HBase export error - job initialization fails (possible security issue?)

Question asked by tc_dev on Sep 14, 2013
Latest reply on Sep 15, 2013 by tc_dev
I am trying to run an HBase table export job, and it fails to start.

The cluster was previously switched from running as root to the mapr user, but MapReduce had not been fully tested until now. I suspect a permissions problem somewhere, but I'm not sure where to look next to troubleshoot this.

Below is the output/error (the stack trace lines are truncated as copied). Any suggestions for how to fix this?

    13/09/15 02:44:26 INFO mapred.JobClient: Task Id : attempt_201308280054_0004_m_000003_0, Status : FAILED on node XXXX
    Error initializing attempt_201308280054_0004_m_000003_0: Job initialization failed (20). with output: Reading task controller config from /opt/mapr/hadoop/hadoop-0.20.2/conf/taskcontroller.cfg
    number of groups = 9
    main : command provided 0
    main : user is root
    Failed to create directory /tmp/mapr-hadoop/mapred/local/taskTracker/root - No such file or directory
    failed to initialize user directory
     at org.apache.hadoop.mapred.LinuxTaskController.initializeJob(
     at org.apache.hadoop.mapred.TaskTracker$
     at Method)
     at org.apache.hadoop.mapred.TaskTracker.initializeJob(
     at org.apache.hadoop.mapred.TaskTracker.localizeJob(
     at org.apache.hadoop.mapred.TaskTracker$
    Caused by: org.apache.hadoop.util.Shell$ExitCodeException:
     at org.apache.hadoop.util.Shell.runCommand(
     at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(
     at org.apache.hadoop.mapred.LinuxTaskController.initializeJob(
     ... 7 more
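The log shows the task controller running as root and failing on a missing `/tmp/mapr-hadoop/mapred/local/taskTracker/root` directory, which fits the root-to-mapr switch. A possible first set of checks, as a sketch (the ownership `mapr:mapr`, the table name `mytable`, and the output path are assumptions, not from the log):

```shell
# 1. On the failing node, does the local taskTracker dir tree exist at all?
ls -ld /tmp/mapr-hadoop/mapred/local/taskTracker

# 2. If it is missing, recreate it owned by the cluster user
#    (assuming mapr:mapr is the intended owner after the switch):
sudo mkdir -p /tmp/mapr-hadoop/mapred/local/taskTracker
sudo chown -R mapr:mapr /tmp/mapr-hadoop/mapred/local

# 3. Re-submit the export as the mapr user instead of root
#    ("mytable" and the output dir are placeholders):
sudo -u mapr hbase org.apache.hadoop.hbase.mapreduce.Export \
    mytable /user/mapr/mytable-export
```

If the job was submitted as root, the LinuxTaskController tries to localize the job under a `taskTracker/root` directory, so running the client as the same user the cluster services run under may avoid the mismatch entirely.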