
Spark Jobs failing

Question asked by dzndrx on Aug 1, 2017
Latest reply on Aug 15, 2017 by dzndrx

Hi Community,

We successfully set up SAS Data Loader for Hadoop and can run jobs using MapReduce, but Spark jobs keep failing on submission. Specifically, the log says "LauncherMapper died".

The job error is:

"LauncherMapper died, check Hadoop LOG for job maprnfs:///job_##########"

(Our understanding is that the LauncherMapper is the map-only job Oozie uses to launch the actual Spark action, so there should be launcher logs somewhere.) We tried to find the logs for this job but had no luck; all I can provide is the available job info.
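
For reference, this is roughly how we have been trying to pull the launcher logs; the workflow and application IDs below are placeholders, not the real IDs from the failed run:

    # Check the workflow status and the Oozie-side log for the failed action
    oozie job -info 0000123-170801000000-oozie-mapr-W
    oozie job -log 0000123-170801000000-oozie-mapr-W

    # Pull the aggregated container logs for the launcher job itself
    # (requires YARN log aggregation to be enabled)
    yarn logs -applicationId application_1501234567890_0042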

We also ran a sample word count with Spark directly, and it completed successfully.
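
For that sanity check we ran something along these lines with the stock Spark example; the jar path and input file below are placeholders, since the examples jar location varies by Spark version and MapR layout:

    spark-submit \
      --master yarn \
      --class org.apache.spark.examples.JavaWordCount \
      /opt/mapr/spark/spark-1.6.1/lib/spark-examples-*.jar \
      /tmp/wordcount/input.txt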


Is this an Oozie issue?

Attached are some logs (SAS and MapR logs) for the job.

Any help is appreciated; at the moment we can't see any errors in the logs, and the only hint is "LauncherMapper died".
