We successfully set up SAS Data Loader for Hadoop and are able to run jobs using MapReduce, but submitting Spark jobs continues to fail. Specifically, the log says "LauncherMapper died":
"LauncherMapper died, check Hadoop LOG for job maprnfs:///job_##########"
We tried to locate the logs for this job but couldn't find them. All I can provide is the available job info.
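For reference, these are roughly the commands we tried while hunting for the launcher logs (the Oozie URL, workflow job ID, and YARN application ID below are placeholders, not values from our cluster):

```shell
# Placeholder values -- substitute your actual Oozie server host,
# workflow job ID, and YARN application ID.
OOZIE_URL=http://oozie-host:11000/oozie
OOZIE_JOB_ID=0000001-000000000000000-oozie-oozi-W

# List the workflow's actions and their external (launcher) job IDs
oozie job -oozie "$OOZIE_URL" -info "$OOZIE_JOB_ID"

# Pull the aggregated YARN container logs for the launcher application
yarn logs -applicationId application_0000000000000_0001
```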
We also ran a sample word count using Spark, and it completed successfully.
Is this an Oozie issue?
Attached are some logs (SAS and MapR logs) related to the job.
Any help is appreciated; currently we can't see any errors in the logs. The only hint is "LauncherMapper died".