
LD_LIBRARY_PATH and CLASSPATH in hadoop job

Question asked by ashish on Dec 9, 2012
Latest reply on Dec 12, 2012 by gera
I need to modify the LD_LIBRARY_PATH, JAVA_LIBRARY_PATH, and CLASSPATH of a Hadoop job before running it on the cluster. To LD_LIBRARY_PATH I need to add the location of some jars that are required while running the job, since these jars are already available on my cluster nodes; similarly for CLASSPATH.

I have a 3-node cluster. I need to modify LD_LIBRARY_PATH and CLASSPATH on all 3 data nodes so that the jars already present on each cluster node are added to the classpath and are available while the job runs. I want to avoid distributing the jars with the job and instead use the copies already available on the cluster nodes.

1. I have tried modifying hadoop-env.sh to change the CLASSPATH:

    export HADOOP_TASKTRACKER_OPTS="-classpath:/opt/oracle/oraloader-2.0.0-2/jlib/"

but the above modifies HADOOP_CLASSPATH, not CLASSPATH.
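For reference, a minimal hadoop-env.sh sketch that appends to HADOOP_CLASSPATH directly on each node; this is an assumption about intent (the jlib path is the oraloader directory from above), not a confirmed fix, and whether it reaches the task JVMs depends on the Hadoop version:

```shell
# hadoop-env.sh (on every node): prepend the node-local oraloader jars
# to HADOOP_CLASSPATH, which the hadoop launcher scripts consult.
# The /* wildcard picks up all jars in the directory (Java 6+).
export HADOOP_CLASSPATH=/opt/oracle/oraloader-2.0.0-2/jlib/*:$HADOOP_CLASSPATH
```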

2. For LD_LIBRARY_PATH and JAVA_LIBRARY_PATH I have tried adding the property below to mapred-site.xml, as suggested to me, but that didn't work.

    <property>
      <name>mapred.child.env</name>
      <value>JAVA_LIBRARY_PATH=/opt/oracle/oraloader-2.0.0-2/lib/</value>
      <value>LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/opt/oracle/oraloader-2.0.0-2/lib/</value>
      <description>User added environment variables for the task tracker child
      processes. Example: 1) A=foo  This will set the env variable A to foo.
      2) B=$B:c  This inherits the tasktracker's B env variable.
      </description>
    </property>
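As an aside, a property element normally carries a single value; in Hadoop 1.x, mapred.child.env takes one comma-separated list of VAR=value pairs rather than two value elements. A sketch of the config in that shape (same oraloader paths as above; whether this resolves the missing jars is untested here) would be:

```xml
<!-- mapred-site.xml: one <value> element, variables separated by commas.
     $LD_LIBRARY_PATH on the right-hand side inherits the tasktracker's value. -->
<property>
  <name>mapred.child.env</name>
  <value>JAVA_LIBRARY_PATH=/opt/oracle/oraloader-2.0.0-2/lib/,LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/opt/oracle/oraloader-2.0.0-2/lib/</value>
</property>
```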

I have also restarted all 3 data nodes, all tasktrackers, and both namenodes. Still, these variables are not set, and my Hadoop job cannot find the jar files required to run the test.
